Feb 27 06:10:20 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 27 06:10:20 crc restorecon[4700]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 06:10:20 crc restorecon[4700]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc 
restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc 
restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 
06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 
crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 
06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc 
restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:20 crc restorecon[4700]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:20 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 06:10:21 crc restorecon[4700]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 
crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc 
restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 06:10:21 crc restorecon[4700]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc 
restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 06:10:21 crc restorecon[4700]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 27 06:10:21 crc kubenswrapper[4725]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 06:10:21 crc kubenswrapper[4725]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 27 06:10:21 crc kubenswrapper[4725]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 06:10:21 crc kubenswrapper[4725]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 27 06:10:21 crc kubenswrapper[4725]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 27 06:10:21 crc kubenswrapper[4725]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.979795 4725 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986180 4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986213 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986223 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986233 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986242 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986250 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986259 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986269 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986278 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986314 4725 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986322 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986330 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986338 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986346 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986354 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986361 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986370 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986377 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986385 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986396 4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986406 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986416 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986432 4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986442 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986450 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986458 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986469 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986480 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986488 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986496 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986506 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986514 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986524 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986532 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986540 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986549 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986558 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986566 4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986574 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986583 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986594 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986603 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986613 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986622 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986631 4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986642 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986652 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986661 4725 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986670 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986678 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986686 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986716 4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986725 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986733 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986740 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986748 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986758 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986765 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986773 4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986781 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986789 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986797 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986804 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986812 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986820 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986828 4725 feature_gate.go:330] unrecognized feature gate: Example
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986835 4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986843 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986851 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986859 4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.986866 4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987567 4725 flags.go:64] FLAG: --address="0.0.0.0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987590 4725 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987606 4725 flags.go:64] FLAG: --anonymous-auth="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987618 4725 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987629 4725 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987639 4725 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987650 4725 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987666 4725 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987676 4725 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987685 4725 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987695 4725 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987704 4725 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987713 4725 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987722 4725 flags.go:64] FLAG: --cgroup-root=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987731 4725 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987740 4725 flags.go:64] FLAG: --client-ca-file=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987749 4725 flags.go:64] FLAG: --cloud-config=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987757 4725 flags.go:64] FLAG: --cloud-provider=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987766 4725 flags.go:64] FLAG: --cluster-dns="[]"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987776 4725 flags.go:64] FLAG: --cluster-domain=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987785 4725 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987794 4725 flags.go:64] FLAG: --config-dir=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987803 4725 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987812 4725 flags.go:64] FLAG: --container-log-max-files="5"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987823 4725 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987832 4725 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987841 4725 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987850 4725 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987860 4725 flags.go:64] FLAG: --contention-profiling="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987870 4725 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987879 4725 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987888 4725 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987897 4725 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987908 4725 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987917 4725 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987926 4725 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987935 4725 flags.go:64] FLAG: --enable-load-reader="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987946 4725 flags.go:64] FLAG: --enable-server="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987956 4725 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987968 4725 flags.go:64] FLAG: --event-burst="100"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987977 4725 flags.go:64] FLAG: --event-qps="50"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987987 4725 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.987996 4725 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988005 4725 flags.go:64] FLAG: --eviction-hard=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988016 4725 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988025 4725 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988034 4725 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988044 4725 flags.go:64] FLAG: --eviction-soft=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988052 4725 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988061 4725 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988070 4725 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988079 4725 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988088 4725 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988097 4725 flags.go:64] FLAG: --fail-swap-on="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988106 4725 flags.go:64] FLAG: --feature-gates=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988117 4725 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988125 4725 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988134 4725 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988143 4725 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988152 4725 flags.go:64] FLAG: --healthz-port="10248"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988162 4725 flags.go:64] FLAG: --help="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988171 4725 flags.go:64] FLAG: --hostname-override=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988180 4725 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988189 4725 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988197 4725 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988206 4725 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988215 4725 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988223 4725 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988232 4725 flags.go:64] FLAG: --image-service-endpoint=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988241 4725 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988250 4725 flags.go:64] FLAG: --kube-api-burst="100"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988259 4725 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988268 4725 flags.go:64] FLAG: --kube-api-qps="50"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988278 4725 flags.go:64] FLAG: --kube-reserved=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988330 4725 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988340 4725 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988349 4725 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988358 4725 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988367 4725 flags.go:64] FLAG: --lock-file=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988376 4725 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988385 4725 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988394 4725 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988408 4725 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988417 4725 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988425 4725 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988434 4725 flags.go:64] FLAG: --logging-format="text"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988443 4725 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988453 4725 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988462 4725 flags.go:64] FLAG: --manifest-url=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988470 4725 flags.go:64] FLAG: --manifest-url-header=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988481 4725 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988491 4725 flags.go:64] FLAG: --max-open-files="1000000"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988502 4725 flags.go:64] FLAG: --max-pods="110"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988511 4725 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988520 4725 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988529 4725 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988538 4725 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988547 4725 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988556 4725 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988565 4725 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988584 4725 flags.go:64] FLAG: --node-status-max-images="50"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988593 4725 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988602 4725 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988612 4725 flags.go:64] FLAG: --pod-cidr=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988620 4725 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988634 4725 flags.go:64] FLAG: --pod-manifest-path=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988643 4725 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988652 4725 flags.go:64] FLAG: --pods-per-core="0"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988661 4725 flags.go:64] FLAG: --port="10250"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988671 4725 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988679 4725 flags.go:64] FLAG: --provider-id=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988688 4725 flags.go:64] FLAG: --qos-reserved=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988697 4725 flags.go:64] FLAG: --read-only-port="10255"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988706 4725 flags.go:64] FLAG: --register-node="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988715 4725 flags.go:64] FLAG: --register-schedulable="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988723 4725 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988737 4725 flags.go:64] FLAG: --registry-burst="10"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988746 4725 flags.go:64] FLAG: --registry-qps="5"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988755 4725 flags.go:64] FLAG: --reserved-cpus=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988764 4725 flags.go:64] FLAG: --reserved-memory=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988775 4725 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988785 4725 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988794 4725 flags.go:64] FLAG: --rotate-certificates="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988804 4725 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988813 4725 flags.go:64] FLAG: --runonce="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988822 4725 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988832 4725 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988841 4725 flags.go:64] FLAG: --seccomp-default="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988850 4725 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988860 4725 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988869 4725 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988879 4725 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988888 4725 flags.go:64] FLAG: --storage-driver-password="root"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988898 4725 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988907 4725 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988916 4725 flags.go:64] FLAG: --storage-driver-user="root"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988926 4725 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988935 4725 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988944 4725 flags.go:64] FLAG: --system-cgroups=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988953 4725 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988967 4725 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988976 4725 flags.go:64] FLAG: --tls-cert-file=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988984 4725 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.988996 4725 flags.go:64] FLAG: --tls-min-version=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989005 4725 flags.go:64] FLAG: --tls-private-key-file=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989014 4725 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989024 4725 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989033 4725 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989041 4725 flags.go:64] FLAG: --v="2"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989053 4725 flags.go:64] FLAG: --version="false"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989064 4725 flags.go:64] FLAG: --vmodule=""
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989074 4725 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 27 06:10:21 crc kubenswrapper[4725]: I0227 06:10:21.989084 4725 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989314 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989325 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989336 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989346 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989355 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989363 4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989372 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989380 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989387 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989403 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989410 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989418 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989426 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989434 4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989442 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989450 4725 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989458 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989468 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989478 4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989486 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989494 4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989504 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989513 4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989521 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989528 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989536 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989544 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989552 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 27 06:10:21 crc kubenswrapper[4725]: W0227 06:10:21.989560 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989568 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989576 4725 feature_gate.go:330] unrecognized feature gate: Example
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989584 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989592 4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989600 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989608 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989615 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989623 4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989631 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989639 4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989646 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989654 4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989664 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989671 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989680 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989687 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989695 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989702 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989710 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989718 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989725 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989734 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989741 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989748 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989757 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989764 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989772 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989779 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989787 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989794 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989827 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989840 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989850 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989859 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989867 4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989877 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989885 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989893 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989901 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989908 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989919 4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:21.989929 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:21.989952 4725 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.002503 4725 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.002551 4725 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002690 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002715 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002725 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002733 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002744 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002752 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002761 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 
06:10:22.002770 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002779 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002823 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002831 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002842 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002852 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002860 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002869 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002877 4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002884 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002892 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002900 4725 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002910 4725 feature_gate.go:330] unrecognized feature gate: Example Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002919 4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002927 4725 feature_gate.go:330] unrecognized feature gate: 
BuildCSIVolumes Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002934 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002942 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002950 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002958 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002966 4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002974 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002982 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002990 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.002998 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003006 4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003013 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003021 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003028 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003037 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 06:10:22 crc 
kubenswrapper[4725]: W0227 06:10:22.003045 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003054 4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003061 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003069 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003077 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003085 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003093 4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003101 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003109 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003119 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003128 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003143 4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003158 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003170 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003180 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003190 4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003200 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003211 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003225 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003237 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003248 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003259 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003267 4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003275 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003283 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003320 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003328 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003336 4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003344 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003351 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003359 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003367 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003375 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003383 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003391 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.003405 4725 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003646 4725 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003659 4725 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003670 4725 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003679 4725 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003687 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003696 4725 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003704 4725 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003713 4725 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003721 4725 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003729 4725 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003738 4725 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 06:10:22 crc 
kubenswrapper[4725]: W0227 06:10:22.003746 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003755 4725 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003763 4725 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003771 4725 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003779 4725 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003788 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003797 4725 feature_gate.go:330] unrecognized feature gate: Example Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003805 4725 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003817 4725 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003828 4725 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003837 4725 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003845 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003854 4725 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003863 4725 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003872 4725 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003880 4725 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003888 4725 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003897 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003905 4725 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003914 4725 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003922 4725 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003930 4725 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003937 4725 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 
06:10:22.003945 4725 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003953 4725 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003961 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003969 4725 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003977 4725 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003984 4725 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.003992 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004000 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004008 4725 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004015 4725 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004023 4725 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004031 4725 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004039 4725 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004049 4725 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004058 4725 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004068 4725 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004077 4725 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004085 4725 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004093 4725 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004101 4725 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004109 4725 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004118 4725 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004126 4725 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004134 4725 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004142 4725 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004149 4725 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004157 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004166 4725 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 06:10:22 crc kubenswrapper[4725]: 
W0227 06:10:22.004176 4725 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004186 4725 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004195 4725 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004206 4725 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004215 4725 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004226 4725 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004239 4725 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004251 4725 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.004264 4725 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.004279 4725 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.004583 4725 server.go:940] "Client rotation is on, will bootstrap in background" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.009859 4725 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.014811 4725 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.014975 4725 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.016983 4725 server.go:997] "Starting client certificate rotation" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.017060 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.017529 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.044949 4725 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.047993 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.049162 4725 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.070776 4725 log.go:25] "Validated CRI v1 runtime API" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.109409 4725 log.go:25] "Validated CRI v1 image API" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.111675 4725 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.119265 4725 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-27-06-05-10-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.119659 4725 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.149519 4725 manager.go:217] Machine: {Timestamp:2026-02-27 06:10:22.144643929 +0000 UTC m=+0.607264538 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8cd3fff4-1c99-4289-9cf4-2c947cb81dcd BootID:d597123e-4f5a-4643-94b9-026053817d04 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:ac:40:3c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ac:40:3c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b9:a9:94 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9c:5e:8d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:df:7d:f1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:89:6a:71 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:df:b3:73:08:92 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ea:9b:c1:dc:81:c2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.149794 4725 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.149989 4725 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.150919 4725 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.151106 4725 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.151142 4725 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.151415 4725 topology_manager.go:138] "Creating topology manager with none policy"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.151428 4725 container_manager_linux.go:303] "Creating device plugin manager"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.152009 4725 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.152047 4725 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.152186 4725 state_mem.go:36] "Initialized new in-memory state store"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.152276 4725 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.156472 4725 kubelet.go:418] "Attempting to sync node with API server"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.156495 4725 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.156513 4725 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.156526 4725 kubelet.go:324] "Adding apiserver pod source"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.156540 4725 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.161422 4725 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.162558 4725 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.164940 4725 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.165379 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.165413 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.165484 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.165510 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166326 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166354 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166363 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166372 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166387 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166396 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166405 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166426 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166438 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166447 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166458 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.166468 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.168796 4725 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.169251 4725 server.go:1280] "Started kubelet"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.172527 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.173239 4725 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 27 06:10:22 crc systemd[1]: Started Kubernetes Kubelet.
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.174363 4725 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.175170 4725 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.180068 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.180174 4725 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.180777 4725 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.180822 4725 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.180798 4725 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.182031 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.185382 4725 factory.go:55] Registering systemd factory
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.186490 4725 factory.go:221] Registration of the systemd container factory successfully
Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.186112 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.186916 4725 factory.go:153] Registering CRI-O factory
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.187080 4725 factory.go:221] Registration of the crio container factory successfully
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.187384 4725 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.187548 4725 factory.go:103] Registering Raw factory
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.187669 4725 manager.go:1196] Started watching for new ooms in manager
Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.186955 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.188014 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.188610 4725 server.go:460] "Adding debug handlers to kubelet server"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.190345 4725 manager.go:319] Starting recovery of all containers
Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.189803 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1898059c4911bb22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,LastTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196394 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196456 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196482 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196501 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196527 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196545 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196563 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196582 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196603 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196620 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196640 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196664 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196689 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196716 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196740 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196758 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196780 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196796 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196818 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196836 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196891 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196909 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196928 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196949 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196966 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.196991 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197019 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197039 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197058 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197076 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197096 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197122 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197146 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197168 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197186 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197206 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197225 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197242 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197261 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197278 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197329 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197348 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197365 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197422 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197441 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197461 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197480 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197498 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197516 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197534 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197553 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197571 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197596 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197616 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197636 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197654 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197673 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197691 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197710 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197726 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197743 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197760 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197778 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197797 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197813 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197831 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197849 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197865 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197886 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197905 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197923 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197943 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197960 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197979 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.197996 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198014 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198031 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198048 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198066 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198085 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198101 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198118 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198136 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198154 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198173 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198190 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198207 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198226 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.198245 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200651 4725 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200704 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200727 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200746 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200765 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200786 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200804 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200821 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200838 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200857 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200875 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200893 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200910 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200927 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200947 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200964 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.200990 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201008 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201027 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201046 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201065 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201086 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201105 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201124 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201160 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201179 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201198 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201262 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201325 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201347 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201366 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201385 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201405 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201423 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201442 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201461 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201479 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201497 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201514 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201531 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201549 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201578 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201596 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201614 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201634 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201654 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201671 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201688 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201706 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201724 
4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201742 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201760 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201778 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201794 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201811 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201829 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201847 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201864 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201880 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201899 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201917 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201934 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201951 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201978 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.201995 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202022 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202040 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202159 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202178 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202202 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202220 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202237 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202262 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202312 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202331 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202383 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202424 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202441 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202460 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202478 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 
06:10:22.202495 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202551 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202604 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202643 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202660 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202678 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202694 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202710 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202728 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202745 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202763 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202804 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202838 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202854 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202870 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202888 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202905 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202922 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202939 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202977 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.202994 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203010 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203028 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203044 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203061 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: 
I0227 06:10:22.203138 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203157 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203367 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203448 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203493 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203589 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203608 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203624 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203739 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203765 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203822 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203892 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203957 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203976 4725 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.203994 4725 reconstruct.go:97] "Volume reconstruction finished" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.204008 4725 reconciler.go:26] "Reconciler: start to sync state" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.229640 4725 manager.go:324] Recovery completed Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.243902 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.245795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.245833 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.245844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.246555 4725 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.246577 4725 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.246596 4725 state_mem.go:36] "Initialized new in-memory state store" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.246706 4725 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.249360 4725 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.250174 4725 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.250217 4725 kubelet.go:2335] "Starting kubelet main sync loop" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.250264 4725 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.251619 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.251699 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.262746 4725 policy_none.go:49] "None policy: Start" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.264123 4725 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.264197 4725 state_mem.go:35] "Initializing new in-memory state store" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.282422 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.350351 4725 kubelet.go:2359] 
"Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.376877 4725 manager.go:334] "Starting Device Plugin manager" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.376956 4725 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.376975 4725 server.go:79] "Starting device plugin registration server" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.377556 4725 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.377661 4725 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.377891 4725 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.378081 4725 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.378091 4725 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.385468 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.388165 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.479349 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc 
kubenswrapper[4725]: I0227 06:10:22.480956 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.481002 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.481025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.481064 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.481652 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.551739 4725 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.551865 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.553663 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.553740 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.553760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: 
I0227 06:10:22.553939 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.554621 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.554825 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.555077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.555122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.555139 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.555333 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.555491 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.555553 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.560371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.560433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.560454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562558 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562565 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562626 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562653 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562666 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562692 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.562875 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.563103 
4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.563184 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.564821 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.564864 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.564907 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.564928 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.564871 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.565051 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.565125 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.565397 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.565451 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.566567 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.566613 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.566640 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.566813 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.566857 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.566882 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.567227 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.567335 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.568629 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.568690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.568715 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608487 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608532 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608570 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608598 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608629 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608656 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608685 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608712 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608745 
4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608771 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608863 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608931 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.608982 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.609016 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.609069 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.681888 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.683484 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.683524 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.683542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.683577 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.684206 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710465 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710555 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710606 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710638 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710666 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710695 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: 
I0227 06:10:22.710705 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710755 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710786 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710842 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710840 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710870 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710897 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710926 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710937 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 
06:10:22.710954 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710988 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711037 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711086 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711131 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711178 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711226 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711271 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711373 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.710902 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711432 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711485 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.711512 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: E0227 06:10:22.789197 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.901490 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.914444 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.925156 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.960368 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8aa1f8dadb6529b114091f2cde8a76a98ca39d8f7edf6c0cb1d85e191b125a67 WatchSource:0}: Error finding container 8aa1f8dadb6529b114091f2cde8a76a98ca39d8f7edf6c0cb1d85e191b125a67: Status 404 returned error can't find the container with id 8aa1f8dadb6529b114091f2cde8a76a98ca39d8f7edf6c0cb1d85e191b125a67 Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.966678 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5b80b7dc8351beb7eb11e5d34cbfd1e0c36c288caca16a196b58376bbe38bbfd WatchSource:0}: Error finding container 5b80b7dc8351beb7eb11e5d34cbfd1e0c36c288caca16a196b58376bbe38bbfd: Status 404 returned error can't find the container with id 5b80b7dc8351beb7eb11e5d34cbfd1e0c36c288caca16a196b58376bbe38bbfd Feb 27 06:10:22 crc kubenswrapper[4725]: W0227 06:10:22.972179 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2f8ac34081da87b9b463aa4ae99d6f4435195fb9e91f4f0237583f936473a2b3 WatchSource:0}: Error finding container 2f8ac34081da87b9b463aa4ae99d6f4435195fb9e91f4f0237583f936473a2b3: Status 404 returned error can't find the container with id 2f8ac34081da87b9b463aa4ae99d6f4435195fb9e91f4f0237583f936473a2b3 Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.973347 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:22 crc kubenswrapper[4725]: I0227 06:10:22.987170 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:23 crc kubenswrapper[4725]: W0227 06:10:23.004860 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3d2a056e8e97c11b28264b0d02316e8df2a541c12d19c878b5b80acadcf39929 WatchSource:0}: Error finding container 3d2a056e8e97c11b28264b0d02316e8df2a541c12d19c878b5b80acadcf39929: Status 404 returned error can't find the container with id 3d2a056e8e97c11b28264b0d02316e8df2a541c12d19c878b5b80acadcf39929 Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.085000 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.086815 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.086885 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.086911 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.086957 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.089527 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Feb 27 06:10:23 crc kubenswrapper[4725]: W0227 06:10:23.123915 4725 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.124041 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.174775 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.254508 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2f8ac34081da87b9b463aa4ae99d6f4435195fb9e91f4f0237583f936473a2b3"} Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.256219 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5b80b7dc8351beb7eb11e5d34cbfd1e0c36c288caca16a196b58376bbe38bbfd"} Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.257801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8aa1f8dadb6529b114091f2cde8a76a98ca39d8f7edf6c0cb1d85e191b125a67"} Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.259204 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8ebeca8c265cd0edd2852f6cadce39efcca55e5e7f6f1457dc4e65007a0daf8a"} Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.260544 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3d2a056e8e97c11b28264b0d02316e8df2a541c12d19c878b5b80acadcf39929"} Feb 27 06:10:23 crc kubenswrapper[4725]: W0227 06:10:23.358388 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.358554 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:23 crc kubenswrapper[4725]: W0227 06:10:23.578679 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.578869 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" 
Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.590581 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Feb 27 06:10:23 crc kubenswrapper[4725]: W0227 06:10:23.632232 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.632398 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.890460 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.892635 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.892691 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.892708 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:23 crc kubenswrapper[4725]: I0227 06:10:23.892740 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:23 crc kubenswrapper[4725]: E0227 06:10:23.893358 4725 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.142050 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 06:10:24 crc kubenswrapper[4725]: E0227 06:10:24.143437 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.174718 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.266623 4725 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7" exitCode=0 Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.266752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.266852 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.268690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:24 crc 
kubenswrapper[4725]: I0227 06:10:24.268761 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.268781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.269537 4725 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4" exitCode=0 Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.269657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.269826 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.271031 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.271090 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.271116 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.273705 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.273756 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.273823 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.277182 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f" exitCode=0 Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.277344 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.277422 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.278851 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.278900 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.278921 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.280838 4725 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328" exitCode=0 Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.280892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328"} Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.281022 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.285127 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.290684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.290743 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.290768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.291017 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.291275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:24 crc kubenswrapper[4725]: I0227 06:10:24.291343 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.174471 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:25 crc kubenswrapper[4725]: E0227 06:10:25.191652 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.290549 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.290603 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.290610 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.290699 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.291971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.292223 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:25 crc 
kubenswrapper[4725]: I0227 06:10:25.292233 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.297450 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.297646 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.298833 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.298864 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.298872 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.307058 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.307107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.307117 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.307126 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.309348 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.309466 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.310352 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.310384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.310395 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.311360 4725 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd" exitCode=0 Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.311395 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd"} Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.311472 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.312313 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.312339 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.312350 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.493677 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.494797 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.494831 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.494845 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:25 crc kubenswrapper[4725]: I0227 06:10:25.494869 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:25 crc kubenswrapper[4725]: E0227 06:10:25.495254 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Feb 27 06:10:25 crc kubenswrapper[4725]: W0227 06:10:25.602744 4725 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Feb 27 06:10:25 crc kubenswrapper[4725]: E0227 06:10:25.602813 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.318062 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5d84d39d3369456e2730b9b30965fcced8500f9f1b81eb428c5204e64787f650"} Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.318155 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.319361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.319384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.319393 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.321793 4725 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb" exitCode=0 Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.321885 4725 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.321951 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.321977 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.322001 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb"} Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.322059 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.322105 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323150 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323175 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323320 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323343 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323351 
4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323580 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323631 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323653 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323601 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.323735 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:26 crc kubenswrapper[4725]: I0227 06:10:26.596000 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.336752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936"} Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.336867 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb"} Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.336896 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359"} Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.336902 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.336901 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.338574 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.338623 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.338641 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.339078 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.339149 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.339166 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:27 crc kubenswrapper[4725]: I0227 06:10:27.788099 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.274943 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:28 crc 
kubenswrapper[4725]: I0227 06:10:28.313845 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.343057 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808"} Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.343168 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.343161 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71"} Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.343110 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.344351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.344390 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.344405 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.344688 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.344728 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.344744 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.695699 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.697991 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.698059 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.698078 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:28 crc kubenswrapper[4725]: I0227 06:10:28.698134 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.079106 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.215963 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.216245 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.217873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.217918 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.217929 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:29 crc 
kubenswrapper[4725]: I0227 06:10:29.220517 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.345912 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.345976 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.346074 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348512 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348548 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348689 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.348989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.349050 4725 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:29 crc kubenswrapper[4725]: I0227 06:10:29.349064 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.349141 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.350430 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.350492 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.350516 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.429038 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.429186 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.429233 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.430706 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.430953 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.431137 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:30 crc kubenswrapper[4725]: I0227 06:10:30.740032 
4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:31 crc kubenswrapper[4725]: I0227 06:10:31.351413 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:10:31 crc kubenswrapper[4725]: I0227 06:10:31.351488 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:31 crc kubenswrapper[4725]: I0227 06:10:31.352716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:31 crc kubenswrapper[4725]: I0227 06:10:31.352770 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:31 crc kubenswrapper[4725]: I0227 06:10:31.352787 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:32 crc kubenswrapper[4725]: I0227 06:10:32.017961 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:32 crc kubenswrapper[4725]: I0227 06:10:32.354749 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:32 crc kubenswrapper[4725]: I0227 06:10:32.356016 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:32 crc kubenswrapper[4725]: I0227 06:10:32.356075 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:32 crc kubenswrapper[4725]: I0227 06:10:32.356095 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:32 crc kubenswrapper[4725]: E0227 06:10:32.385586 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Feb 27 06:10:33 crc kubenswrapper[4725]: I0227 06:10:33.740476 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 06:10:33 crc kubenswrapper[4725]: I0227 06:10:33.740593 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 06:10:35 crc kubenswrapper[4725]: W0227 06:10:35.847644 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 06:10:35 crc kubenswrapper[4725]: I0227 06:10:35.847804 4725 trace.go:236] Trace[943245934]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 06:10:25.846) (total time: 10001ms): Feb 27 06:10:35 crc kubenswrapper[4725]: Trace[943245934]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (06:10:35.847) Feb 27 06:10:35 crc kubenswrapper[4725]: Trace[943245934]: [10.001124272s] [10.001124272s] END Feb 27 06:10:35 crc kubenswrapper[4725]: E0227 06:10:35.847849 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.175390 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 27 06:10:36 crc kubenswrapper[4725]: W0227 06:10:36.201098 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.201191 4725 trace.go:236] Trace[879063255]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 06:10:26.200) (total time: 10000ms): Feb 27 06:10:36 crc kubenswrapper[4725]: Trace[879063255]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (06:10:36.201) Feb 27 06:10:36 crc kubenswrapper[4725]: Trace[879063255]: [10.000786651s] [10.000786651s] END Feb 27 06:10:36 crc kubenswrapper[4725]: E0227 06:10:36.201214 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.266986 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.267269 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 
27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.268701 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.268751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.268769 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.308592 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.368347 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.370880 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d84d39d3369456e2730b9b30965fcced8500f9f1b81eb428c5204e64787f650" exitCode=255 Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.370960 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5d84d39d3369456e2730b9b30965fcced8500f9f1b81eb428c5204e64787f650"} Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.371026 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.371273 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.371941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 
06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.371982 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.371997 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.372616 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.372638 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.372649 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.373016 4725 scope.go:117] "RemoveContainer" containerID="5d84d39d3369456e2730b9b30965fcced8500f9f1b81eb428c5204e64787f650" Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.393122 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 27 06:10:36 crc kubenswrapper[4725]: W0227 06:10:36.637512 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 06:10:36 crc kubenswrapper[4725]: I0227 06:10:36.637608 4725 trace.go:236] Trace[376780725]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 06:10:26.636) (total time: 10001ms): Feb 27 06:10:36 crc kubenswrapper[4725]: Trace[376780725]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:10:36.637) Feb 27 06:10:36 crc 
kubenswrapper[4725]: Trace[376780725]: [10.001380709s] [10.001380709s] END
Feb 27 06:10:36 crc kubenswrapper[4725]: E0227 06:10:36.637648 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 27 06:10:37 crc kubenswrapper[4725]: E0227 06:10:37.029135 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:37 crc kubenswrapper[4725]: E0227 06:10:37.032342 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Feb 27 06:10:37 crc kubenswrapper[4725]: E0227 06:10:37.036016 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 27 06:10:37 crc kubenswrapper[4725]: E0227 06:10:37.046616 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898059c4911bb22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,LastTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 06:10:37 crc kubenswrapper[4725]: W0227 06:10:37.049938 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:37 crc kubenswrapper[4725]: E0227 06:10:37.050041 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.056991 4725 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.057077 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.065029 4725 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.065098 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.177115 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:37Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.377278 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.379954 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"}
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.380113 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.380231 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.381412 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.381464 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.381482 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.382590 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.382668 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.382695 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.802909 4725 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]log ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]etcd ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/priority-and-fairness-filter ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-apiextensions-informers ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-apiextensions-controllers ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/crd-informer-synced ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-system-namespaces-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 27 06:10:37 crc kubenswrapper[4725]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/bootstrap-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-registration-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-discovery-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]autoregister-completion ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-openapi-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 27 06:10:37 crc kubenswrapper[4725]: livez check failed
Feb 27 06:10:37 crc kubenswrapper[4725]: I0227 06:10:37.803029 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.176991 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:38Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.385444 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.386276 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.389678 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa" exitCode=255
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.389736 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"}
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.389806 4725 scope.go:117] "RemoveContainer" containerID="5d84d39d3369456e2730b9b30965fcced8500f9f1b81eb428c5204e64787f650"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.389991 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.391460 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.391533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.391610 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:38 crc kubenswrapper[4725]: I0227 06:10:38.392755 4725 scope.go:117] "RemoveContainer" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"
Feb 27 06:10:38 crc kubenswrapper[4725]: E0227 06:10:38.393046 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 06:10:39 crc kubenswrapper[4725]: I0227 06:10:39.177647 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:39Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:39 crc kubenswrapper[4725]: W0227 06:10:39.351045 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:39Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:39 crc kubenswrapper[4725]: E0227 06:10:39.351162 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:39 crc kubenswrapper[4725]: I0227 06:10:39.396132 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 06:10:40 crc kubenswrapper[4725]: I0227 06:10:40.179085 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:40Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:40 crc kubenswrapper[4725]: W0227 06:10:40.516022 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:40Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:40 crc kubenswrapper[4725]: E0227 06:10:40.516525 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:41 crc kubenswrapper[4725]: I0227 06:10:41.179051 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:41Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.025614 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.025862 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.027441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.027510 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.027533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:42 crc kubenswrapper[4725]: W0227 06:10:42.115692 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:42Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:42 crc kubenswrapper[4725]: E0227 06:10:42.115807 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.177254 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:42Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:42 crc kubenswrapper[4725]: E0227 06:10:42.385706 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.797156 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.797410 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.798837 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.798882 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.798899 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.799698 4725 scope.go:117] "RemoveContainer" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"
Feb 27 06:10:42 crc kubenswrapper[4725]: E0227 06:10:42.799988 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 06:10:42 crc kubenswrapper[4725]: I0227 06:10:42.803703 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.179576 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:43Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.411342 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.412709 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.412789 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.412808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.413597 4725 scope.go:117] "RemoveContainer" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"
Feb 27 06:10:43 crc kubenswrapper[4725]: E0227 06:10:43.413900 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.436450 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:43 crc kubenswrapper[4725]: E0227 06:10:43.437216 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:43Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.437956 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.438003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.438020 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.438077 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 06:10:43 crc kubenswrapper[4725]: E0227 06:10:43.442578 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:43Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.740851 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 27 06:10:43 crc kubenswrapper[4725]: I0227 06:10:43.740934 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 27 06:10:44 crc kubenswrapper[4725]: I0227 06:10:44.179323 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:44Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:45 crc kubenswrapper[4725]: I0227 06:10:45.094618 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 27 06:10:45 crc kubenswrapper[4725]: E0227 06:10:45.100388 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:45 crc kubenswrapper[4725]: I0227 06:10:45.179548 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:45Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:46 crc kubenswrapper[4725]: W0227 06:10:46.060659 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:46Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:46 crc kubenswrapper[4725]: E0227 06:10:46.060766 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.178654 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:46Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.596834 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.597104 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.598661 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.598711 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.598727 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:46 crc kubenswrapper[4725]: I0227 06:10:46.599572 4725 scope.go:117] "RemoveContainer" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"
Feb 27 06:10:46 crc kubenswrapper[4725]: E0227 06:10:46.599850 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 06:10:47 crc kubenswrapper[4725]: E0227 06:10:47.052927 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898059c4911bb22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,LastTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 06:10:47 crc kubenswrapper[4725]: I0227 06:10:47.178804 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:47Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.108586 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.108881 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.110652 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.110711 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.110737 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.111645 4725 scope.go:117] "RemoveContainer" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.179776 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:48Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.429685 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.433265 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8"}
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.433494 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.434767 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.434815 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:48 crc kubenswrapper[4725]: I0227 06:10:48.434831 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:49 crc kubenswrapper[4725]: I0227 06:10:49.177658 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:49Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:49 crc kubenswrapper[4725]: W0227 06:10:49.296061 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:49Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:49 crc kubenswrapper[4725]: E0227 06:10:49.296164 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.178801 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:50Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:50 crc kubenswrapper[4725]: E0227 06:10:50.441879 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:50Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.442270 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.442788 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.443151 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.444235 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.444331 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.444360 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.444401 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.445966 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8" exitCode=255
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.446043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8"}
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.446131 4725 scope.go:117] "RemoveContainer" containerID="547880b23e5cdc2ef36a0755ef952360dd1fa6fade7d92775de9268c15e79baa"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.446365 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.447542 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.447591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.447608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:10:50 crc kubenswrapper[4725]: I0227 06:10:50.448662 4725 scope.go:117] "RemoveContainer" containerID="d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8"
Feb 27 06:10:50 crc kubenswrapper[4725]: E0227 06:10:50.448944 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 06:10:50 crc kubenswrapper[4725]: E0227 06:10:50.449556 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:50Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 27 06:10:50 crc kubenswrapper[4725]: W0227 06:10:50.816160 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:50Z is after 2026-02-23T05:33:13Z
Feb 27 06:10:50 crc kubenswrapper[4725]: E0227 06:10:50.816342 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 06:10:51 crc kubenswrapper[4725]: I0227 06:10:51.178237 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:51Z
is after 2026-02-23T05:33:13Z Feb 27 06:10:51 crc kubenswrapper[4725]: I0227 06:10:51.452431 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 06:10:52 crc kubenswrapper[4725]: I0227 06:10:52.178465 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 2026-02-23T05:33:13Z Feb 27 06:10:52 crc kubenswrapper[4725]: E0227 06:10:52.385834 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.179172 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:53Z is after 2026-02-23T05:33:13Z Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.740777 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.740878 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.740958 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.741154 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.742781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.742882 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.742904 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.743689 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 27 06:10:53 crc kubenswrapper[4725]: I0227 06:10:53.743951 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5" gracePeriod=30 Feb 27 06:10:53 crc kubenswrapper[4725]: W0227 06:10:53.986384 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:53Z is after 2026-02-23T05:33:13Z Feb 27 06:10:53 crc kubenswrapper[4725]: E0227 06:10:53.986483 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.179838 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:54Z is after 2026-02-23T05:33:13Z Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.467309 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.467695 4725 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5" exitCode=255 Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.467731 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5"} Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.467760 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983"} Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.467853 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.468684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.468749 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:54 crc kubenswrapper[4725]: I0227 06:10:54.468778 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:55 crc kubenswrapper[4725]: I0227 06:10:55.178209 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:55Z is after 2026-02-23T05:33:13Z Feb 27 06:10:56 crc kubenswrapper[4725]: I0227 06:10:56.178345 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:56Z is after 2026-02-23T05:33:13Z Feb 27 06:10:56 crc kubenswrapper[4725]: 
I0227 06:10:56.596330 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:56 crc kubenswrapper[4725]: I0227 06:10:56.596560 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:56 crc kubenswrapper[4725]: I0227 06:10:56.598167 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:56 crc kubenswrapper[4725]: I0227 06:10:56.598228 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:56 crc kubenswrapper[4725]: I0227 06:10:56.598252 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:56 crc kubenswrapper[4725]: I0227 06:10:56.599240 4725 scope.go:117] "RemoveContainer" containerID="d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8" Feb 27 06:10:56 crc kubenswrapper[4725]: E0227 06:10:56.599610 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:10:57 crc kubenswrapper[4725]: E0227 06:10:57.058690 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:57Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898059c4911bb22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,LastTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:10:57 crc kubenswrapper[4725]: I0227 06:10:57.178152 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:57Z is after 2026-02-23T05:33:13Z Feb 27 06:10:57 crc kubenswrapper[4725]: E0227 06:10:57.447886 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:57Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 06:10:57 crc kubenswrapper[4725]: I0227 06:10:57.450172 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:57 crc kubenswrapper[4725]: I0227 06:10:57.451786 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:57 crc kubenswrapper[4725]: I0227 06:10:57.451856 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:57 crc kubenswrapper[4725]: I0227 06:10:57.451879 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 
06:10:57 crc kubenswrapper[4725]: I0227 06:10:57.451922 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:10:57 crc kubenswrapper[4725]: E0227 06:10:57.456766 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:57Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.107873 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.108105 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.109622 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.109687 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.109711 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.110620 4725 scope.go:117] "RemoveContainer" containerID="d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8" Feb 27 06:10:58 crc kubenswrapper[4725]: E0227 06:10:58.110972 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:10:58 crc kubenswrapper[4725]: I0227 06:10:58.178526 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:58Z is after 2026-02-23T05:33:13Z Feb 27 06:10:59 crc kubenswrapper[4725]: I0227 06:10:59.176872 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:59Z is after 2026-02-23T05:33:13Z Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.178879 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:00Z is after 2026-02-23T05:33:13Z Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.429161 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.429495 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.434548 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.434643 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 
06:11:00.434715 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:00 crc kubenswrapper[4725]: W0227 06:11:00.451170 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:00Z is after 2026-02-23T05:33:13Z Feb 27 06:11:00 crc kubenswrapper[4725]: E0227 06:11:00.451529 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.740109 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.740360 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.741730 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.741787 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:00 crc kubenswrapper[4725]: I0227 06:11:00.741809 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:01 crc kubenswrapper[4725]: I0227 06:11:01.178760 4725 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:01Z is after 2026-02-23T05:33:13Z Feb 27 06:11:02 crc kubenswrapper[4725]: I0227 06:11:02.179780 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:02Z is after 2026-02-23T05:33:13Z Feb 27 06:11:02 crc kubenswrapper[4725]: I0227 06:11:02.222786 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 06:11:02 crc kubenswrapper[4725]: E0227 06:11:02.227324 4725 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 06:11:02 crc kubenswrapper[4725]: E0227 06:11:02.228625 4725 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 27 06:11:02 crc kubenswrapper[4725]: E0227 06:11:02.386380 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:11:03 crc kubenswrapper[4725]: I0227 06:11:03.178565 4725 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:03Z is after 2026-02-23T05:33:13Z Feb 27 06:11:03 crc kubenswrapper[4725]: I0227 06:11:03.740123 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 06:11:03 crc kubenswrapper[4725]: I0227 06:11:03.740209 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 06:11:04 crc kubenswrapper[4725]: I0227 06:11:04.178744 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:04Z is after 2026-02-23T05:33:13Z Feb 27 06:11:04 crc kubenswrapper[4725]: E0227 06:11:04.451686 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:04Z is after 2026-02-23T05:33:13Z" interval="7s" 
Feb 27 06:11:04 crc kubenswrapper[4725]: I0227 06:11:04.457792 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:04 crc kubenswrapper[4725]: I0227 06:11:04.458995 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:04 crc kubenswrapper[4725]: I0227 06:11:04.459040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:04 crc kubenswrapper[4725]: I0227 06:11:04.459053 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:04 crc kubenswrapper[4725]: I0227 06:11:04.459080 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:11:04 crc kubenswrapper[4725]: E0227 06:11:04.463815 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:04Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 06:11:05 crc kubenswrapper[4725]: I0227 06:11:05.178965 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:05Z is after 2026-02-23T05:33:13Z Feb 27 06:11:06 crc kubenswrapper[4725]: I0227 06:11:06.178738 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:06Z is after 2026-02-23T05:33:13Z Feb 27 06:11:06 crc 
kubenswrapper[4725]: W0227 06:11:06.654947 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:06Z is after 2026-02-23T05:33:13Z Feb 27 06:11:06 crc kubenswrapper[4725]: E0227 06:11:06.655045 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 06:11:07 crc kubenswrapper[4725]: E0227 06:11:07.064248 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:07Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898059c4911bb22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,LastTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:07 crc kubenswrapper[4725]: I0227 06:11:07.177752 4725 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:07Z is after 2026-02-23T05:33:13Z Feb 27 06:11:08 crc kubenswrapper[4725]: I0227 06:11:08.179421 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:08Z is after 2026-02-23T05:33:13Z Feb 27 06:11:09 crc kubenswrapper[4725]: I0227 06:11:09.179232 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:09Z is after 2026-02-23T05:33:13Z Feb 27 06:11:10 crc kubenswrapper[4725]: I0227 06:11:10.179638 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:10Z is after 2026-02-23T05:33:13Z Feb 27 06:11:11 crc kubenswrapper[4725]: I0227 06:11:11.177444 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:11Z is after 2026-02-23T05:33:13Z Feb 27 06:11:11 crc kubenswrapper[4725]: E0227 06:11:11.457151 4725 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:11Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 06:11:11 crc kubenswrapper[4725]: I0227 06:11:11.464310 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:11 crc kubenswrapper[4725]: I0227 06:11:11.465940 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:11 crc kubenswrapper[4725]: I0227 06:11:11.466026 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:11 crc kubenswrapper[4725]: I0227 06:11:11.466048 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:11 crc kubenswrapper[4725]: I0227 06:11:11.466083 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:11:11 crc kubenswrapper[4725]: E0227 06:11:11.471404 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:11Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 06:11:12 crc kubenswrapper[4725]: I0227 06:11:12.180562 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:12Z is after 2026-02-23T05:33:13Z Feb 27 06:11:12 crc kubenswrapper[4725]: E0227 06:11:12.386657 4725 eviction_manager.go:285] "Eviction manager: failed to get summary 
stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.178992 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:11:13Z is after 2026-02-23T05:33:13Z Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.251194 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.252726 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.252775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.252792 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.253687 4725 scope.go:117] "RemoveContainer" containerID="d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.593018 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.595793 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4"} Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.596018 4725 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.597619 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.597650 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.597659 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.740951 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 06:11:13 crc kubenswrapper[4725]: I0227 06:11:13.741039 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 06:11:14 crc kubenswrapper[4725]: W0227 06:11:14.053694 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 27 06:11:14 crc kubenswrapper[4725]: E0227 06:11:14.053770 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group 
\"\" at the cluster scope" logger="UnhandledError" Feb 27 06:11:14 crc kubenswrapper[4725]: I0227 06:11:14.178906 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:14 crc kubenswrapper[4725]: I0227 06:11:14.465263 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 06:11:14 crc kubenswrapper[4725]: I0227 06:11:14.465926 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:14 crc kubenswrapper[4725]: I0227 06:11:14.466918 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:14 crc kubenswrapper[4725]: I0227 06:11:14.466971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:14 crc kubenswrapper[4725]: I0227 06:11:14.466988 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.181331 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.602565 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.603779 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.606683 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" exitCode=255 Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.606729 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4"} Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.606803 4725 scope.go:117] "RemoveContainer" containerID="d5c686bdd824f5366d3555076471231d41d82613952d654e7ef1183e6898e8b8" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.607080 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.608730 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.608777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.608799 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:15 crc kubenswrapper[4725]: I0227 06:11:15.609912 4725 scope.go:117] "RemoveContainer" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:11:15 crc kubenswrapper[4725]: E0227 06:11:15.610235 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.182270 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:16 crc kubenswrapper[4725]: W0227 06:11:16.365015 4725 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:16 crc kubenswrapper[4725]: E0227 06:11:16.365096 4725 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.596516 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.612108 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.615273 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.616664 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:16 
crc kubenswrapper[4725]: I0227 06:11:16.616891 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.617072 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:16 crc kubenswrapper[4725]: I0227 06:11:16.618035 4725 scope.go:117] "RemoveContainer" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:11:16 crc kubenswrapper[4725]: E0227 06:11:16.619060 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.070966 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4911bb22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,LastTimestamp:2026-02-27 06:10:22.169226018 +0000 UTC m=+0.631846597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.077735 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.084182 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.090978 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.097551 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c55bbca1d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.381697565 +0000 UTC m=+0.844318174,LastTimestamp:2026-02-27 06:10:22.381697565 +0000 UTC m=+0.844318174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.106189 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.480985581 +0000 UTC m=+0.943606180,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.113017 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2c691\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.481014662 +0000 UTC m=+0.943635271,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.119444 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2ed83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 
06:10:22.481034703 +0000 UTC m=+0.943655302,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.125868 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.553721284 +0000 UTC m=+1.016341893,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.132442 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2c691\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.553752085 +0000 UTC m=+1.016372684,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.139095 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2ed83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 06:10:22.553769975 +0000 UTC m=+1.016390584,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.146114 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.555105566 +0000 UTC m=+1.017726175,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.152895 4725 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2c691\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.555132966 +0000 UTC m=+1.017753575,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.160078 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2ed83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 06:10:22.555147886 +0000 UTC m=+1.017768495,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.171111 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.560419686 +0000 UTC m=+1.023040285,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.178164 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2c691\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.560445537 +0000 UTC m=+1.023066136,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: I0227 06:11:17.178352 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.181066 4725 event.go:359] "Server 
rejected event (will not retry!)" err="events \"crc.1898059c4da2ed83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 06:10:22.560465157 +0000 UTC m=+1.023085766,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.184780 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.562609756 +0000 UTC m=+1.025230365,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.190215 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2c691\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.562643477 +0000 UTC m=+1.025264086,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.196722 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.562642077 +0000 UTC m=+1.025262686,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.203346 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2ed83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 06:10:22.562666877 +0000 UTC m=+1.025287476,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.209700 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2c691\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2c691 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245840529 +0000 UTC m=+0.708461108,LastTimestamp:2026-02-27 06:10:22.562683338 +0000 UTC m=+1.025303947,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.216209 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2ed83\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2ed83 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is 
now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245850499 +0000 UTC m=+0.708471078,LastTimestamp:2026-02-27 06:10:22.562703748 +0000 UTC m=+1.025324357,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.222885 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.564851737 +0000 UTC m=+1.027472336,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.229379 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898059c4da2753c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898059c4da2753c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.245819708 +0000 UTC 
m=+0.708440287,LastTimestamp:2026-02-27 06:10:22.564892598 +0000 UTC m=+1.027513197,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.238207 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059c78d7d5a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.970738089 +0000 UTC m=+1.433358698,LastTimestamp:2026-02-27 06:10:22.970738089 +0000 UTC m=+1.433358698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.244605 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898059c78e1bec9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.971387593 +0000 UTC m=+1.434008202,LastTimestamp:2026-02-27 06:10:22.971387593 +0000 UTC m=+1.434008202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.252272 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059c7930d237 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:22.976569911 +0000 UTC m=+1.439190520,LastTimestamp:2026-02-27 06:10:22.976569911 +0000 UTC m=+1.439190520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.259144 4725 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059c7b1b9164 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.008731492 +0000 UTC m=+1.471352121,LastTimestamp:2026-02-27 06:10:23.008731492 +0000 UTC m=+1.471352121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.266965 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059c7c344b16 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.02712911 +0000 UTC m=+1.489749719,LastTimestamp:2026-02-27 06:10:23.02712911 +0000 UTC m=+1.489749719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.275521 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059c9eec8f01 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.609630465 +0000 UTC m=+2.072251044,LastTimestamp:2026-02-27 06:10:23.609630465 +0000 UTC m=+2.072251044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.282368 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059c9fccc1d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created 
container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.624323539 +0000 UTC m=+2.086944128,LastTimestamp:2026-02-27 06:10:23.624323539 +0000 UTC m=+2.086944128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.289354 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059ca0161dfa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.629131258 +0000 UTC m=+2.091751837,LastTimestamp:2026-02-27 06:10:23.629131258 +0000 UTC m=+2.091751837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.296335 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059ca08fed2e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.637114158 +0000 UTC m=+2.099734747,LastTimestamp:2026-02-27 06:10:23.637114158 +0000 UTC m=+2.099734747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.303565 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059ca0971901 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.637584129 +0000 UTC m=+2.100204708,LastTimestamp:2026-02-27 06:10:23.637584129 +0000 UTC m=+2.100204708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.310097 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059ca0afdde4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.639207396 +0000 UTC m=+2.101827985,LastTimestamp:2026-02-27 06:10:23.639207396 +0000 UTC m=+2.101827985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.317113 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ca0b09b31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.639255857 +0000 UTC m=+2.101876436,LastTimestamp:2026-02-27 06:10:23.639255857 +0000 UTC m=+2.101876436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.324719 4725 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898059ca1223574 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.646700916 +0000 UTC m=+2.109321495,LastTimestamp:2026-02-27 06:10:23.646700916 +0000 UTC m=+2.109321495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.332606 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059ca15a76ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.65038769 +0000 UTC m=+2.113008269,LastTimestamp:2026-02-27 06:10:23.65038769 +0000 UTC m=+2.113008269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.339861 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ca16c51c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.651557827 +0000 UTC m=+2.114178406,LastTimestamp:2026-02-27 06:10:23.651557827 +0000 UTC m=+2.114178406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.346885 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898059ca215eed3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.662673619 +0000 UTC m=+2.125294198,LastTimestamp:2026-02-27 06:10:23.662673619 +0000 UTC 
m=+2.125294198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.353964 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cb18157d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.921371094 +0000 UTC m=+2.383991703,LastTimestamp:2026-02-27 06:10:23.921371094 +0000 UTC m=+2.383991703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.361439 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cb25780b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.935406261 +0000 UTC m=+2.398026860,LastTimestamp:2026-02-27 06:10:23.935406261 +0000 UTC m=+2.398026860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.369118 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cb2706a6c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.937038956 +0000 UTC m=+2.399659575,LastTimestamp:2026-02-27 06:10:23.937038956 +0000 UTC m=+2.399659575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.377359 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cc0b3b7f8 openshift-kube-controller-manager 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.176330744 +0000 UTC m=+2.638951343,LastTimestamp:2026-02-27 06:10:24.176330744 +0000 UTC m=+2.638951343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.383354 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cc1c22a4c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.194054732 +0000 UTC m=+2.656675331,LastTimestamp:2026-02-27 06:10:24.194054732 +0000 UTC m=+2.656675331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.390491 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cc1d8eb1c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.195545884 +0000 UTC m=+2.658166483,LastTimestamp:2026-02-27 06:10:24.195545884 +0000 UTC m=+2.658166483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.397789 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059cc65292a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.270627495 +0000 UTC m=+2.733248104,LastTimestamp:2026-02-27 06:10:24.270627495 +0000 UTC 
m=+2.733248104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.404495 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059cc68d7a00 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.274487808 +0000 UTC m=+2.737108407,LastTimestamp:2026-02-27 06:10:24.274487808 +0000 UTC m=+2.737108407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.411165 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cc7157a87 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.283400839 +0000 UTC m=+2.746021448,LastTimestamp:2026-02-27 06:10:24.283400839 +0000 UTC m=+2.746021448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.419185 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898059cc7bfdd1e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.294567198 +0000 UTC m=+2.757187807,LastTimestamp:2026-02-27 06:10:24.294567198 +0000 UTC m=+2.757187807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.425989 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cd0cd0d71 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.446426481 +0000 UTC m=+2.909047050,LastTimestamp:2026-02-27 06:10:24.446426481 +0000 UTC m=+2.909047050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.433907 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cd19a1d33 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.459865395 +0000 UTC m=+2.922485964,LastTimestamp:2026-02-27 06:10:24.459865395 +0000 UTC m=+2.922485964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 
06:11:17.440565 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059cd47fde64 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.508477028 +0000 UTC m=+2.971097617,LastTimestamp:2026-02-27 06:10:24.508477028 +0000 UTC m=+2.971097617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.447047 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898059cd4c87a7c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.51323558 +0000 UTC m=+2.975856149,LastTimestamp:2026-02-27 06:10:24.51323558 +0000 UTC m=+2.975856149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.453453 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cd4c8798c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.51323534 +0000 UTC m=+2.975855909,LastTimestamp:2026-02-27 06:10:24.51323534 +0000 UTC m=+2.975855909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.459605 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059cd4cce05f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.513523807 +0000 UTC m=+2.976144376,LastTimestamp:2026-02-27 06:10:24.513523807 +0000 UTC m=+2.976144376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.467746 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059cd54d59cd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.521943501 +0000 UTC m=+2.984564070,LastTimestamp:2026-02-27 06:10:24.521943501 +0000 UTC m=+2.984564070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.474682 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059cd55cdc7b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.522959995 +0000 UTC m=+2.985580564,LastTimestamp:2026-02-27 06:10:24.522959995 +0000 UTC m=+2.985580564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.481398 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cd596f758 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.52676796 +0000 UTC m=+2.989388529,LastTimestamp:2026-02-27 06:10:24.52676796 +0000 UTC m=+2.989388529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.488163 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cd5a621c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.527761865 +0000 UTC m=+2.990382434,LastTimestamp:2026-02-27 06:10:24.527761865 +0000 UTC m=+2.990382434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.491836 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059cd61bd3f8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.535475192 +0000 UTC m=+2.998095761,LastTimestamp:2026-02-27 06:10:24.535475192 +0000 UTC m=+2.998095761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.495706 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898059cd6217eb9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.535846585 +0000 UTC m=+2.998467154,LastTimestamp:2026-02-27 06:10:24.535846585 +0000 UTC m=+2.998467154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.498465 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059ce00d7a5d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.702306909 +0000 UTC m=+3.164927478,LastTimestamp:2026-02-27 06:10:24.702306909 +0000 UTC m=+3.164927478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.503366 4725 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059ce0bd9434 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.71384786 +0000 UTC m=+3.176468429,LastTimestamp:2026-02-27 06:10:24.71384786 +0000 UTC m=+3.176468429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.509634 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059ce0ccc96d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.714844525 +0000 UTC m=+3.177465124,LastTimestamp:2026-02-27 06:10:24.714844525 +0000 UTC 
m=+3.177465124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.516457 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ce0dac381 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.715760513 +0000 UTC m=+3.178381082,LastTimestamp:2026-02-27 06:10:24.715760513 +0000 UTC m=+3.178381082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.523002 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ce2b67db8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.746937784 
+0000 UTC m=+3.209558353,LastTimestamp:2026-02-27 06:10:24.746937784 +0000 UTC m=+3.209558353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.529407 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ce2cb6b2b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.748309291 +0000 UTC m=+3.210929860,LastTimestamp:2026-02-27 06:10:24.748309291 +0000 UTC m=+3.210929860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.535855 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059ced47d7ac openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.924235692 +0000 UTC m=+3.386856261,LastTimestamp:2026-02-27 06:10:24.924235692 +0000 UTC m=+3.386856261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.542089 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cede99ae6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.934836966 +0000 UTC m=+3.397457535,LastTimestamp:2026-02-27 06:10:24.934836966 +0000 UTC m=+3.397457535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.549014 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898059cee297f25 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.939024165 +0000 UTC m=+3.401644734,LastTimestamp:2026-02-27 06:10:24.939024165 +0000 UTC m=+3.401644734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.555471 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ceed0380a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.949950474 +0000 UTC m=+3.412571043,LastTimestamp:2026-02-27 06:10:24.949950474 +0000 UTC m=+3.412571043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.562250 4725 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059ceedc57f6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:24.950745078 +0000 UTC m=+3.413365647,LastTimestamp:2026-02-27 06:10:24.950745078 +0000 UTC m=+3.413365647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.569642 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cfa799f81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.145601921 +0000 UTC m=+3.608222490,LastTimestamp:2026-02-27 06:10:25.145601921 +0000 UTC m=+3.608222490,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.576599 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cfb2f0557 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.157490007 +0000 UTC m=+3.620110576,LastTimestamp:2026-02-27 06:10:25.157490007 +0000 UTC m=+3.620110576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.583014 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cfb404275 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.158619765 +0000 UTC m=+3.621240334,LastTimestamp:2026-02-27 06:10:25.158619765 +0000 UTC m=+3.621240334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.594012 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d04835f62 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.314013026 +0000 UTC m=+3.776633595,LastTimestamp:2026-02-27 06:10:25.314013026 +0000 UTC m=+3.776633595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.600670 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059d070bebcb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.356516299 +0000 UTC m=+3.819136898,LastTimestamp:2026-02-27 06:10:25.356516299 +0000 UTC m=+3.819136898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.607482 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059d07f50081 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.371791489 +0000 UTC m=+3.834412058,LastTimestamp:2026-02-27 06:10:25.371791489 +0000 UTC m=+3.834412058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.614457 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d1088477b 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.515661179 +0000 UTC m=+3.978281748,LastTimestamp:2026-02-27 06:10:25.515661179 +0000 UTC m=+3.978281748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.621064 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d114315e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.527903718 +0000 UTC m=+3.990524287,LastTimestamp:2026-02-27 06:10:25.527903718 +0000 UTC m=+3.990524287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.633147 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d40c860d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.325168337 +0000 UTC m=+4.787788906,LastTimestamp:2026-02-27 06:10:26.325168337 +0000 UTC m=+4.787788906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.640241 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d4fdd7209 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.578207241 +0000 UTC m=+5.040827830,LastTimestamp:2026-02-27 06:10:26.578207241 +0000 UTC m=+5.040827830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.650859 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d509c6bbb 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.590723003 +0000 UTC m=+5.053343582,LastTimestamp:2026-02-27 06:10:26.590723003 +0000 UTC m=+5.053343582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.657739 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d50ae2de2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.591886818 +0000 UTC m=+5.054507427,LastTimestamp:2026-02-27 06:10:26.591886818 +0000 UTC m=+5.054507427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.664879 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1898059d5e9ca7c7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.825619399 +0000 UTC m=+5.288239968,LastTimestamp:2026-02-27 06:10:26.825619399 +0000 UTC m=+5.288239968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.671879 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d5fa252b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.842768049 +0000 UTC m=+5.305388648,LastTimestamp:2026-02-27 06:10:26.842768049 +0000 UTC m=+5.305388648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.678485 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d5fb9160d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:26.844259853 +0000 UTC m=+5.306880452,LastTimestamp:2026-02-27 06:10:26.844259853 +0000 UTC m=+5.306880452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.684989 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d6e77cf69 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.091640169 +0000 UTC m=+5.554260748,LastTimestamp:2026-02-27 06:10:27.091640169 +0000 UTC m=+5.554260748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.691850 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d6f09e778 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.101214584 +0000 UTC m=+5.563835183,LastTimestamp:2026-02-27 06:10:27.101214584 +0000 UTC m=+5.563835183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.699890 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d6f1b3acc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.102350028 +0000 UTC m=+5.564970597,LastTimestamp:2026-02-27 06:10:27.102350028 +0000 UTC m=+5.564970597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.702728 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d7da6f4e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.346388194 +0000 UTC m=+5.809008803,LastTimestamp:2026-02-27 06:10:27.346388194 +0000 UTC m=+5.809008803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.707411 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d7e9cbcb9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.362495673 +0000 UTC m=+5.825116282,LastTimestamp:2026-02-27 06:10:27.362495673 +0000 UTC m=+5.825116282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.714627 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d7eb56d11 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.364113681 +0000 UTC m=+5.826734280,LastTimestamp:2026-02-27 06:10:27.364113681 +0000 UTC m=+5.826734280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.720867 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898059d8d1d0000 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.605782528 +0000 UTC m=+6.068403127,LastTimestamp:2026-02-27 06:10:27.605782528 +0000 UTC m=+6.068403127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.727977 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1898059d8e05fa3e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:27.621050942 +0000 UTC m=+6.083671551,LastTimestamp:2026-02-27 06:10:27.621050942 +0000 UTC m=+6.083671551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.737856 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.1898059efac62c3d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 27 06:11:17 crc kubenswrapper[4725]: body: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:33.740553277 +0000 UTC m=+12.203173886,LastTimestamp:2026-02-27 06:10:33.740553277 +0000 UTC m=+12.203173886,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.744319 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059efac77d7a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:33.74063961 +0000 UTC m=+12.203260209,LastTimestamp:2026-02-27 06:10:33.74063961 +0000 UTC m=+12.203260209,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.751966 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898059cfb404275\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059cfb404275 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.158619765 +0000 UTC m=+3.621240334,LastTimestamp:2026-02-27 06:10:36.374361558 +0000 UTC m=+14.836982127,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.759072 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898059d070bebcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059d070bebcb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.356516299 +0000 UTC m=+3.819136898,LastTimestamp:2026-02-27 06:10:36.607716547 +0000 UTC m=+15.070337126,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.765956 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898059d07f50081\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059d07f50081 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:25.371791489 +0000 UTC m=+3.834412058,LastTimestamp:2026-02-27 06:10:36.618820377 +0000 UTC m=+15.081440966,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.772671 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-apiserver-crc.1898059fc073ee27 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 06:11:17 crc kubenswrapper[4725]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 06:11:17 crc kubenswrapper[4725]: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:37.057052199 +0000 UTC m=+15.519672778,LastTimestamp:2026-02-27 06:10:37.057052199 +0000 UTC m=+15.519672778,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.778432 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059fc074c8f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:37.057108211 +0000 UTC m=+15.519728790,LastTimestamp:2026-02-27 06:10:37.057108211 +0000 UTC m=+15.519728790,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.785424 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898059fc073ee27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-apiserver-crc.1898059fc073ee27 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 06:11:17 crc 
kubenswrapper[4725]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 06:11:17 crc kubenswrapper[4725]: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:37.057052199 +0000 UTC m=+15.519672778,LastTimestamp:2026-02-27 06:10:37.065079648 +0000 UTC m=+15.527700227,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.791054 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898059fc074c8f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898059fc074c8f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:37.057108211 +0000 UTC m=+15.519728790,LastTimestamp:2026-02-27 06:10:37.065132339 +0000 UTC m=+15.527752928,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.798650 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed78472 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 06:11:17 crc kubenswrapper[4725]: body: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740910706 +0000 UTC m=+22.203531305,LastTimestamp:2026-02-27 06:10:43.740910706 +0000 UTC m=+22.203531305,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.805093 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed87078 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740971128 +0000 UTC m=+22.203591727,LastTimestamp:2026-02-27 06:10:43.740971128 +0000 UTC m=+22.203591727,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.809575 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189805a14ed78472\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed78472 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 06:11:17 crc kubenswrapper[4725]: body: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740910706 +0000 UTC m=+22.203531305,LastTimestamp:2026-02-27 06:10:53.740851773 +0000 UTC m=+32.203472382,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.814521 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189805a14ed87078\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed87078 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740971128 +0000 UTC m=+22.203591727,LastTimestamp:2026-02-27 06:10:53.740919205 +0000 UTC m=+32.203539804,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.822699 4725 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189805a3a3115ad2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:53.743921874 +0000 UTC m=+32.206542473,LastTimestamp:2026-02-27 
06:10:53.743921874 +0000 UTC m=+32.206542473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.830485 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898059ca0afdde4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059ca0afdde4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.639207396 +0000 UTC m=+2.101827985,LastTimestamp:2026-02-27 06:10:53.870474518 +0000 UTC m=+32.333095127,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.838451 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898059cb18157d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cb18157d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.921371094 +0000 UTC m=+2.383991703,LastTimestamp:2026-02-27 06:10:54.088509072 +0000 UTC m=+32.551129691,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.845382 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898059cb25780b5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898059cb25780b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:23.935406261 +0000 UTC m=+2.398026860,LastTimestamp:2026-02-27 06:10:54.106264419 +0000 UTC m=+32.568884998,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.856717 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189805a14ed78472\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed78472 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 06:11:17 crc kubenswrapper[4725]: body: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740910706 +0000 UTC m=+22.203531305,LastTimestamp:2026-02-27 06:11:03.740182081 +0000 UTC m=+42.202802680,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.865564 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189805a14ed87078\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed87078 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740971128 +0000 UTC m=+22.203591727,LastTimestamp:2026-02-27 06:11:03.740245363 +0000 UTC m=+42.202865962,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:11:17 crc kubenswrapper[4725]: E0227 06:11:17.875633 4725 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189805a14ed78472\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 06:11:17 crc kubenswrapper[4725]: &Event{ObjectMeta:{kube-controller-manager-crc.189805a14ed78472 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 06:11:17 crc kubenswrapper[4725]: body: Feb 27 06:11:17 crc kubenswrapper[4725]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:10:43.740910706 +0000 UTC m=+22.203531305,LastTimestamp:2026-02-27 06:11:13.741014704 +0000 UTC m=+52.203635303,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 06:11:17 crc kubenswrapper[4725]: > Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.108469 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.108832 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.110810 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.111097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.111234 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.112480 4725 scope.go:117] "RemoveContainer" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:11:18 crc kubenswrapper[4725]: E0227 06:11:18.113087 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.181051 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:18 crc kubenswrapper[4725]: E0227 06:11:18.466826 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.471855 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.473570 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.473763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.473919 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:18 crc kubenswrapper[4725]: I0227 06:11:18.474093 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:11:18 crc kubenswrapper[4725]: E0227 06:11:18.481249 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 06:11:19 crc kubenswrapper[4725]: I0227 06:11:19.180892 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:20 crc kubenswrapper[4725]: I0227 06:11:20.180919 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:21 crc kubenswrapper[4725]: I0227 06:11:21.178229 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.182045 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:22 crc kubenswrapper[4725]: E0227 06:11:22.388371 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.689127 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.689525 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.691337 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.691399 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.691421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:22 crc kubenswrapper[4725]: I0227 06:11:22.695923 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:11:23 crc kubenswrapper[4725]: I0227 06:11:23.182150 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:23 crc 
kubenswrapper[4725]: I0227 06:11:23.638096 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:23 crc kubenswrapper[4725]: I0227 06:11:23.639612 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:23 crc kubenswrapper[4725]: I0227 06:11:23.639763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:23 crc kubenswrapper[4725]: I0227 06:11:23.639881 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:24 crc kubenswrapper[4725]: I0227 06:11:24.180685 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:25 crc kubenswrapper[4725]: I0227 06:11:25.180808 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:25 crc kubenswrapper[4725]: E0227 06:11:25.472771 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 06:11:25 crc kubenswrapper[4725]: I0227 06:11:25.481829 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:25 crc kubenswrapper[4725]: I0227 06:11:25.483680 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:25 crc kubenswrapper[4725]: I0227 06:11:25.483863 
4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:25 crc kubenswrapper[4725]: I0227 06:11:25.483999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:25 crc kubenswrapper[4725]: I0227 06:11:25.484154 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:11:25 crc kubenswrapper[4725]: E0227 06:11:25.489877 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 06:11:26 crc kubenswrapper[4725]: I0227 06:11:26.181111 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:27 crc kubenswrapper[4725]: I0227 06:11:27.181199 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:28 crc kubenswrapper[4725]: I0227 06:11:28.178072 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:29 crc kubenswrapper[4725]: I0227 06:11:29.177630 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:30 crc kubenswrapper[4725]: I0227 06:11:30.177826 4725 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:31 crc kubenswrapper[4725]: I0227 06:11:31.180108 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.180816 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.251532 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.253176 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.253206 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.253215 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.253658 4725 scope.go:117] "RemoveContainer" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:11:32 crc kubenswrapper[4725]: E0227 06:11:32.253820 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:11:32 crc kubenswrapper[4725]: E0227 06:11:32.389027 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:11:32 crc kubenswrapper[4725]: E0227 06:11:32.482279 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.490674 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.492144 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.492207 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.492225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:32 crc kubenswrapper[4725]: I0227 06:11:32.492260 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:11:32 crc kubenswrapper[4725]: E0227 06:11:32.498555 4725 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 06:11:33 crc kubenswrapper[4725]: I0227 06:11:33.182364 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:34 crc kubenswrapper[4725]: I0227 06:11:34.180410 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:34 crc kubenswrapper[4725]: I0227 06:11:34.230361 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 06:11:34 crc kubenswrapper[4725]: I0227 06:11:34.250444 4725 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 06:11:35 crc kubenswrapper[4725]: I0227 06:11:35.181007 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:36 crc kubenswrapper[4725]: I0227 06:11:36.181775 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:37 crc kubenswrapper[4725]: I0227 06:11:37.180843 4725 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 06:11:37 crc kubenswrapper[4725]: I0227 06:11:37.407645 4725 csr.go:261] certificate signing request csr-cj9hs is approved, waiting to be issued Feb 27 06:11:37 crc kubenswrapper[4725]: I0227 06:11:37.419184 4725 csr.go:257] certificate signing request csr-cj9hs is issued Feb 27 06:11:37 crc kubenswrapper[4725]: I0227 
06:11:37.487763 4725 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 27 06:11:38 crc kubenswrapper[4725]: I0227 06:11:38.017648 4725 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 27 06:11:38 crc kubenswrapper[4725]: I0227 06:11:38.421054 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-15 13:33:34.393024762 +0000 UTC Feb 27 06:11:38 crc kubenswrapper[4725]: I0227 06:11:38.421112 4725 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6991h21m55.971918533s for next certificate rotation Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.499508 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.501330 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.501377 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.501396 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.501534 4725 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.511650 4725 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.512240 4725 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.512482 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node 
\"crc\" not found" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.517085 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.517126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.517145 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.517165 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.517182 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:39Z","lastTransitionTime":"2026-02-27T06:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.536770 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.547044 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.547102 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.547124 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.547152 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.547175 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:39Z","lastTransitionTime":"2026-02-27T06:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.568959 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.580901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.581126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.581323 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.581526 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.581894 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:39Z","lastTransitionTime":"2026-02-27T06:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.597457 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.609014 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.609224 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.609428 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.609566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:39 crc kubenswrapper[4725]: I0227 06:11:39.609729 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:39Z","lastTransitionTime":"2026-02-27T06:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.626559 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.626787 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.626830 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.726978 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.827328 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:39 crc kubenswrapper[4725]: E0227 06:11:39.928420 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.029462 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.130007 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.230951 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.331772 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.432811 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.533143 4725 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.633585 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.734458 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.835200 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:40 crc kubenswrapper[4725]: E0227 06:11:40.935560 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.036019 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.137070 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.238071 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.338600 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.439161 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.539595 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.640100 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc 
kubenswrapper[4725]: E0227 06:11:41.740385 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.840649 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:41 crc kubenswrapper[4725]: E0227 06:11:41.941765 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.042553 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.142950 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.243086 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.343739 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.389701 4725 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.444388 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.544594 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.645651 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.746209 4725 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.848512 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:42 crc kubenswrapper[4725]: E0227 06:11:42.950001 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.050330 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.151480 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.251931 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.352455 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.452981 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.553794 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.654113 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.755390 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.856021 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:43 crc kubenswrapper[4725]: E0227 06:11:43.956601 4725 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.057546 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.157923 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.258615 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.358781 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: I0227 06:11:44.380590 4725 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.459078 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.559694 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.660104 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.760489 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.861234 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:44 crc kubenswrapper[4725]: E0227 06:11:44.962401 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc 
kubenswrapper[4725]: E0227 06:11:45.062872 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.163417 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: I0227 06:11:45.251557 4725 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 06:11:45 crc kubenswrapper[4725]: I0227 06:11:45.253027 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:45 crc kubenswrapper[4725]: I0227 06:11:45.253087 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:45 crc kubenswrapper[4725]: I0227 06:11:45.253104 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:45 crc kubenswrapper[4725]: I0227 06:11:45.253877 4725 scope.go:117] "RemoveContainer" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.254093 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.264485 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.365310 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 
06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.465675 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.565908 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.666335 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.766898 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.867064 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:45 crc kubenswrapper[4725]: E0227 06:11:45.967458 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.068398 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.168566 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.268921 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.369100 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.469826 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.570930 4725 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.671896 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.773086 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.874908 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:46 crc kubenswrapper[4725]: E0227 06:11:46.975485 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.076418 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.176582 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.276866 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.377945 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.479040 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.580210 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.680838 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.781505 4725 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.881766 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:47 crc kubenswrapper[4725]: E0227 06:11:47.981954 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.083011 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.183260 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.283708 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.384120 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.485098 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.585748 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: E0227 06:11:48.686193 4725 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.742341 4725 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.788192 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:48 crc 
kubenswrapper[4725]: I0227 06:11:48.788262 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.788277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.788327 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.788342 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:48Z","lastTransitionTime":"2026-02-27T06:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.891785 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.891834 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.891848 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.891868 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.891881 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:48Z","lastTransitionTime":"2026-02-27T06:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.994188 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.994250 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.994271 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.994339 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:48 crc kubenswrapper[4725]: I0227 06:11:48.994360 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:48Z","lastTransitionTime":"2026-02-27T06:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.096915 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.096999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.097024 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.097056 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.097079 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.195607 4725 apiserver.go:52] "Watching apiserver" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.200361 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.200409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.200428 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.200451 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.200469 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.202375 4725 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.203138 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.203622 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.203760 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.203816 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.204077 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.204189 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.204248 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.209768 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.210859 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.212644 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.212683 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.214449 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.215459 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.215690 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.215708 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.216005 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.217408 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.219170 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.223020 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.260170 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.278088 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.287901 4725 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.296882 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.303763 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.303826 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.303851 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.303881 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.303902 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.313886 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.331614 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.347436 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359171 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359243 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359282 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359347 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359413 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359446 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359478 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc 
kubenswrapper[4725]: I0227 06:11:49.359514 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359546 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359576 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359608 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359639 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359669 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359746 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359699 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359887 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359911 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359928 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359959 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359976 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.359994 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360010 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360028 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360046 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360064 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360086 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360062 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360103 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360101 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360220 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360322 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360387 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.360441 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360489 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360536 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360602 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360635 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360642 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360688 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360690 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360734 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360786 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360836 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.360886 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361202 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361265 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361355 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361409 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361458 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361509 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361559 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 
06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361605 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361651 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361702 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361753 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361847 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.362026 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.361806 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.362150 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.362203 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.362248 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364600 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364663 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364716 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364767 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364816 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364864 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364914 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364963 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365018 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365067 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365115 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365167 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365225 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365462 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.367839 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.367906 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.367958 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368061 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368109 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368159 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368215 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368261 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368345 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368400 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368451 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368497 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368551 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368601 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368650 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368700 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368750 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368801 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368852 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368900 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.368950 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.369003 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.369050 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.370790 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 06:11:49 crc 
kubenswrapper[4725]: I0227 06:11:49.370942 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371037 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371103 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371159 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371515 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371581 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371638 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371692 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371747 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371513 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371800 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371855 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371911 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371965 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372020 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372074 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372125 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") 
pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372174 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372218 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372266 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372344 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372395 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372451 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372503 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372566 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372627 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372682 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372735 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" 
(UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372788 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372845 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372916 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372965 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373068 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373117 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373165 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373221 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373276 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373366 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.373420 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373470 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373525 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373585 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373638 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373693 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373749 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373806 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373860 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373908 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373959 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374009 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374510 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374579 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374752 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374808 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377319 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377364 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377412 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377454 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377494 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377530 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377568 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377606 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377640 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377674 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377709 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377743 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377780 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377819 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377858 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377894 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377929 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 06:11:49 crc 
kubenswrapper[4725]: I0227 06:11:49.377967 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378007 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378044 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378078 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378114 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378151 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378189 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378224 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378262 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378339 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378378 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 
06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.362962 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.380115 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.363036 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.363392 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.363474 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.363894 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364017 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.381106 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364050 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.363343 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364699 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364705 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.364801 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365171 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.365173 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.366388 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.366467 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.366133 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.366750 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.366721 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.369732 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.369681 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.369690 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.369792 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.370168 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.370392 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.370413 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.370625 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.370725 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371448 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371502 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371371 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371654 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.371779 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372401 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.372317 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373184 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373200 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373492 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373533 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373560 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373591 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.373641 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374137 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374224 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374801 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.374959 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375047 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375140 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375460 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375522 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375861 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375912 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.375999 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.376044 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.376473 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.376589 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.376599 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.376635 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.376789 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377076 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377322 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377407 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.377754 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378172 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378233 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378373 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.378447 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:11:49.87839929 +0000 UTC m=+88.341019969 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.378902 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.379086 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.379097 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.380428 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.380619 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.380928 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.381964 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382017 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382055 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382158 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382196 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382232 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382270 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382350 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382374 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382391 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382523 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382585 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382590 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382636 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382689 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382742 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382965 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.383032 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.382788 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.383083 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.383179 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.383181 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.383682 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.384226 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.384882 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.385580 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.385620 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.385662 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.385928 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386103 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386340 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386208 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386549 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386577 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388109 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386639 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386654 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386660 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.386716 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.387218 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.387533 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.387541 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.387980 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.385821 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388447 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388503 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388582 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388648 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:49 crc 
kubenswrapper[4725]: I0227 06:11:49.388710 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388750 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388791 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388830 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388872 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388910 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.388982 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389018 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389061 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389117 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389168 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389280 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389364 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389384 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389404 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389422 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389442 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389469 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389501 4725 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389526 4725 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389751 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.389841 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.389875 4725 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389893 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389912 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.389963 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.390054 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:49.890029772 +0000 UTC m=+88.352650381 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390420 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390454 4725 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390489 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390491 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390515 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390582 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390620 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390655 4725 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390684 4725 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390714 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390744 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.390890 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390924 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390953 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.390985 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391051 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391081 4725 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391109 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391067 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391136 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391165 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391193 4725 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391199 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391219 4725 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391244 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391246 4725 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391310 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391330 4725 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391347 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391361 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391377 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391395 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391408 4725 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391423 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391436 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391454 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391474 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node 
\"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391488 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391502 4725 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391516 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391530 4725 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391622 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391637 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391651 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: 
I0227 06:11:49.391663 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391677 4725 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391690 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391702 4725 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391716 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391729 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391742 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391756 4725 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391769 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391785 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391797 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391810 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391822 4725 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391835 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391847 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391860 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391873 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391885 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391898 4725 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391910 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391922 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391934 4725 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on 
node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391947 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391961 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391974 4725 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391986 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391999 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.392011 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.392024 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.391392 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.391947 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.392024 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.392342 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.392352 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.393390 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.393649 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.393886 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.394332 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.394507 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.395386 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.395573 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.395781 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.395987 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.396405 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.396619 4725 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.397052 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). 
InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.397784 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.397959 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.397973 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.398240 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.398515 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.398669 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.399324 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.399831 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.399921 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.400145 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.400883 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.400976 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.401011 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.401204 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.401227 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.401707 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.401843 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402172 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402328 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402410 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402596 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402663 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402742 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402774 4725 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.402907 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:11:49.90287872 +0000 UTC m=+88.365499319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403149 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403184 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403191 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403209 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403273 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403334 4725 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403360 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403382 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403403 4725 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403392 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.403424 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.402484 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.404080 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.404097 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.404397 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.405485 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.405614 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.406306 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.407473 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.407516 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.407548 4725 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.407573 4725 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.407599 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.407620 4725 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408584 4725 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.408623 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408649 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408745 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408767 4725 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408795 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408817 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408839 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.408860 4725 reconciler_common.go:293] 
"Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409047 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409424 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409482 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409497 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409521 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409535 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.409997 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.410010 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.410065 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.410206 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.410805 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.412367 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.412681 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.414094 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.414124 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.414166 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.414197 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.414342 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:49.914272515 +0000 UTC m=+88.376893114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.414544 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.414568 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.415744 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.421781 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.421979 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.422508 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.422684 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:49.922656222 +0000 UTC m=+88.385276801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.422469 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.424038 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.424734 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.424700 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.425105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.425571 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.425720 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.426973 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.426970 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.427237 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.427512 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.427627 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.427808 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.428390 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.428642 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.435016 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.435331 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.436452 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.437148 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.437618 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.437746 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.437851 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.438906 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.440160 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.441039 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.441315 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.441469 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.444748 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.453430 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.472614 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.477389 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510342 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510437 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510478 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510646 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510678 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510723 4725 reconciler_common.go:293] "Volume 
detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510751 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510778 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510805 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510835 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510862 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510889 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510916 4725 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510941 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510966 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.510998 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511023 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511053 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511114 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511213 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511247 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511277 4725 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511343 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511371 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511399 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511429 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511456 4725 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node 
\"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511484 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511510 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511535 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511559 4725 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511583 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511609 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511633 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511659 4725 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511685 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511711 4725 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511737 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511762 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511788 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511815 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511841 4725 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511868 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511894 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511919 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511945 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511971 4725 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.511996 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512023 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512055 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512085 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512111 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512137 4725 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512169 4725 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512194 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512219 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") 
on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512443 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512479 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512507 4725 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512534 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512561 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512585 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512611 4725 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512638 4725 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512663 4725 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512689 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512718 4725 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512744 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512770 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512797 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512823 4725 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512848 4725 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512878 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512904 4725 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512931 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512957 4725 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.512984 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513010 4725 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node 
\"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513036 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513062 4725 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513089 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513119 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513145 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513169 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513194 4725 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 
06:11:49.513221 4725 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513247 4725 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513274 4725 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513351 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.513383 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.514585 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.514621 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.514636 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.514658 4725 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.514673 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.525968 4725 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.538443 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.553726 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 06:11:49 crc kubenswrapper[4725]: W0227 06:11:49.559868 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-03d7f20d38d5b2fe1777ff51cd9d27ac9008afb0c6a8b33e6f826c8a17bb7b76 WatchSource:0}: Error finding container 03d7f20d38d5b2fe1777ff51cd9d27ac9008afb0c6a8b33e6f826c8a17bb7b76: Status 404 returned error can't find the container with id 03d7f20d38d5b2fe1777ff51cd9d27ac9008afb0c6a8b33e6f826c8a17bb7b76 Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.563818 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.568590 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:11:49 crc kubenswrapper[4725]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 27 06:11:49 crc kubenswrapper[4725]: set -o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 27 06:11:49 crc kubenswrapper[4725]: source /etc/kubernetes/apiserver-url.env Feb 27 06:11:49 crc kubenswrapper[4725]: else Feb 27 06:11:49 crc kubenswrapper[4725]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 27 06:11:49 crc kubenswrapper[4725]: exit 1 Feb 27 06:11:49 crc kubenswrapper[4725]: fi Feb 27 06:11:49 crc kubenswrapper[4725]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 27 06:11:49 crc kubenswrapper[4725]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 27 06:11:49 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.570531 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.575843 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:11:49 crc kubenswrapper[4725]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 27 06:11:49 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 27 06:11:49 crc kubenswrapper[4725]: set -o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: source "/env/_master" Feb 27 06:11:49 crc kubenswrapper[4725]: set +o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: fi Feb 27 06:11:49 crc kubenswrapper[4725]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Feb 27 06:11:49 crc kubenswrapper[4725]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 27 06:11:49 crc kubenswrapper[4725]: ho_enable="--enable-hybrid-overlay" Feb 27 06:11:49 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 27 06:11:49 crc kubenswrapper[4725]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 27 06:11:49 crc kubenswrapper[4725]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 27 06:11:49 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 27 06:11:49 crc kubenswrapper[4725]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --webhook-host=127.0.0.1 \ Feb 27 06:11:49 crc kubenswrapper[4725]: --webhook-port=9743 \ Feb 27 06:11:49 crc kubenswrapper[4725]: ${ho_enable} \ Feb 27 06:11:49 crc kubenswrapper[4725]: --enable-interconnect \ Feb 27 06:11:49 crc kubenswrapper[4725]: --disable-approver \ Feb 27 06:11:49 crc kubenswrapper[4725]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --wait-for-kubernetes-api=200s \ Feb 27 06:11:49 crc kubenswrapper[4725]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 27 06:11:49 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 27 06:11:49 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.581900 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:11:49 crc kubenswrapper[4725]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 27 06:11:49 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 27 06:11:49 crc kubenswrapper[4725]: set -o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: source "/env/_master" Feb 27 06:11:49 crc kubenswrapper[4725]: set +o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: fi Feb 27 06:11:49 crc kubenswrapper[4725]: Feb 27 06:11:49 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 27 06:11:49 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 27 06:11:49 crc kubenswrapper[4725]: --disable-webhook \ Feb 27 06:11:49 crc kubenswrapper[4725]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 27 06:11:49 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 27 06:11:49 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: W0227 06:11:49.581949 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b78d9771a233b85907fecde7799d2bd7c012f77d79ff8274896397a9192a5d95 WatchSource:0}: Error finding container b78d9771a233b85907fecde7799d2bd7c012f77d79ff8274896397a9192a5d95: Status 404 returned error can't find the container with id 
b78d9771a233b85907fecde7799d2bd7c012f77d79ff8274896397a9192a5d95 Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.583129 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.586204 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.587540 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.617160 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.617207 
4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.617225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.617248 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.617265 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.697816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.697877 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.697894 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.697919 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.697936 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.713920 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b78d9771a233b85907fecde7799d2bd7c012f77d79ff8274896397a9192a5d95"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.716043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ca65092ac378235c1af9d137934776da3f07f36a0ce76545e64419032dd5f98f"} Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.716850 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.717680 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.717995 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.718799 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"03d7f20d38d5b2fe1777ff51cd9d27ac9008afb0c6a8b33e6f826c8a17bb7b76"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.725391 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.725436 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.725454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.725477 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.725495 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.726149 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:11:49 crc kubenswrapper[4725]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 27 06:11:49 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 27 06:11:49 crc kubenswrapper[4725]: set -o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: source "/env/_master" Feb 27 06:11:49 crc kubenswrapper[4725]: set +o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: fi Feb 27 06:11:49 crc kubenswrapper[4725]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 27 06:11:49 crc kubenswrapper[4725]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 27 06:11:49 crc kubenswrapper[4725]: ho_enable="--enable-hybrid-overlay" Feb 27 06:11:49 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 27 06:11:49 crc kubenswrapper[4725]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 27 06:11:49 crc kubenswrapper[4725]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 27 06:11:49 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 27 06:11:49 crc kubenswrapper[4725]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --webhook-host=127.0.0.1 \ Feb 27 06:11:49 crc kubenswrapper[4725]: --webhook-port=9743 \ Feb 27 06:11:49 crc kubenswrapper[4725]: ${ho_enable} \ Feb 27 06:11:49 crc kubenswrapper[4725]: --enable-interconnect \ Feb 27 06:11:49 crc kubenswrapper[4725]: --disable-approver \ Feb 27 06:11:49 crc kubenswrapper[4725]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --wait-for-kubernetes-api=200s \ Feb 27 06:11:49 crc kubenswrapper[4725]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 27 06:11:49 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 27 06:11:49 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.726496 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:11:49 crc kubenswrapper[4725]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 27 
06:11:49 crc kubenswrapper[4725]: set -o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 27 06:11:49 crc kubenswrapper[4725]: source /etc/kubernetes/apiserver-url.env Feb 27 06:11:49 crc kubenswrapper[4725]: else Feb 27 06:11:49 crc kubenswrapper[4725]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 27 06:11:49 crc kubenswrapper[4725]: exit 1 Feb 27 06:11:49 crc kubenswrapper[4725]: fi Feb 27 06:11:49 crc kubenswrapper[4725]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 27 06:11:49 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 27 06:11:49 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 
06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.727625 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.729803 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:11:49 crc kubenswrapper[4725]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 27 06:11:49 crc kubenswrapper[4725]: if [[ -f "/env/_master" ]]; then Feb 27 06:11:49 crc kubenswrapper[4725]: set -o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: source "/env/_master" Feb 27 06:11:49 crc kubenswrapper[4725]: set +o allexport Feb 27 06:11:49 crc kubenswrapper[4725]: fi Feb 27 06:11:49 crc kubenswrapper[4725]: Feb 27 06:11:49 crc kubenswrapper[4725]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 27 06:11:49 crc kubenswrapper[4725]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 27 06:11:49 crc kubenswrapper[4725]: --disable-webhook \ Feb 27 06:11:49 crc kubenswrapper[4725]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 27 06:11:49 crc kubenswrapper[4725]: --loglevel="${LOGLEVEL}" Feb 27 06:11:49 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 27 06:11:49 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.731038 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.733625 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.740937 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.745643 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.745696 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.745712 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.745734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.745750 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.750161 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.761143 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.763596 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.765681 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.765747 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.765766 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.765796 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.765818 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.778412 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.779976 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.785116 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.785185 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.785203 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.785230 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 
06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.785248 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.795626 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.800993 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.801338 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.804457 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.804517 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.804536 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.804563 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.804582 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.807431 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.823147 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.835497 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.847280 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.861012 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.872880 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.888095 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.907676 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.907737 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.907757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.907784 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.907810 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:49Z","lastTransitionTime":"2026-02-27T06:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.916511 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.916657 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.916717 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:11:50.916672296 +0000 UTC m=+89.379292915 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.916784 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.916809 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 06:11:49 crc kubenswrapper[4725]: I0227 06:11:49.916893 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.916980 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:50.916956675 +0000 UTC m=+89.379577374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.917156 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.917198 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.917220 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.917174 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.917377 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:50.917353006 +0000 UTC m=+89.379973605 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:49 crc kubenswrapper[4725]: E0227 06:11:49.917580 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:50.917552212 +0000 UTC m=+89.380172821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.010445 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.010509 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.010533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.010561 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.010579 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.018220 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.018477 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.018526 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.018547 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.018628 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:51.018602483 +0000 UTC m=+89.481223092 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.113311 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.113383 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.113401 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.113426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.113445 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.216479 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.216536 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.216556 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.216581 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.216604 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.257133 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.258497 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.260710 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.261951 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.263938 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.265007 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.266170 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.268402 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.269725 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.271694 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.272882 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.276990 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.277971 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.280001 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.281191 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.283086 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.284512 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.285455 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.287492 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.288743 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.289851 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.292254 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.293216 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.295524 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.296426 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.298516 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.299879 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.302012 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.303170 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.304236 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.305980 4725 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.306181 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.309224 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.310386 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.310937 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.312946 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.314393 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.315183 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.316696 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.317806 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.318959 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.319554 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.319586 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.319598 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.319614 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.319625 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.320536 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.321788 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.322588 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.323800 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.324567 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.325660 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.326612 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.327877 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.328643 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.329823 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.330568 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.331271 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.332672 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.422944 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.422989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.423005 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.423031 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.423049 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.525282 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.525370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.525388 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.525413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.525430 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.628240 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.628281 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.628344 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.628372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.628389 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.730106 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.730164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.730185 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.730207 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.730224 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.833400 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.833448 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.833466 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.833487 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.833504 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.926609 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.926702 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.926758 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.926843 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927019 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927054 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927071 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927141 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:52.927114646 +0000 UTC m=+91.389735255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927713 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:11:52.927690093 +0000 UTC m=+91.390310692 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927795 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927885 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:52.927866198 +0000 UTC m=+91.390486807 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.927985 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: E0227 06:11:50.928047 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:52.928028433 +0000 UTC m=+91.390649042 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.937742 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.937983 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.938025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.938048 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:11:50 crc kubenswrapper[4725]: I0227 06:11:50.938064 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:50Z","lastTransitionTime":"2026-02-27T06:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.027833 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.028063 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.028090 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.028109 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.028171 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:53.028150727 +0000 UTC m=+91.490771336 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.040421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.040469 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.040485 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.040507 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.040525 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.143222 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.143326 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.143347 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.143373 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.143391 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.246765 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.246814 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.246824 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.246841 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.246853 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.250918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.251104 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.251209 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.251355 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.250865 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:51 crc kubenswrapper[4725]: E0227 06:11:51.251446 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.349620 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.349693 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.349713 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.349739 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.349757 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.452358 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.452422 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.452441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.452467 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.452486 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.555531 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.555594 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.555632 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.555663 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.555683 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.659368 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.659493 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.659504 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.659519 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.659532 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.762751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.762805 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.762826 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.762858 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.762880 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.831921 4725 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.865072 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.865127 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.865147 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.865171 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.865188 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.968245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.968384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.968406 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.968433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:51 crc kubenswrapper[4725]: I0227 06:11:51.968451 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:51Z","lastTransitionTime":"2026-02-27T06:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.071885 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.071938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.071957 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.071980 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.071996 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.174405 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.174491 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.174513 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.174539 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.174559 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.263237 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.268576 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.277782 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.277845 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.277863 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.277887 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.277907 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.283841 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.298987 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.313216 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.326130 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.340159 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.386317 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.386350 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.386358 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.386371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.386381 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.489145 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.489218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.489235 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.489261 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.489279 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.594443 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.594499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.594516 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.594536 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.594556 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.697535 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.697575 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.697591 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.697611 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.697629 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.800268 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.800367 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.800388 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.800413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.800434 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.903668 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.903738 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.903759 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.903783 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.903799 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:52Z","lastTransitionTime":"2026-02-27T06:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.945276 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945419 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 06:11:56.945391479 +0000 UTC m=+95.408012078 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.945486 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.945532 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:52 crc kubenswrapper[4725]: I0227 06:11:52.945571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945665 4725 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945698 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945726 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:56.945714249 +0000 UTC m=+95.408334818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945750 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:56.94573704 +0000 UTC m=+95.408357639 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945797 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945854 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945877 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:52 crc kubenswrapper[4725]: E0227 06:11:52.945975 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:56.945949216 +0000 UTC m=+95.408569825 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.006164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.006193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.006202 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.006213 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.006222 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.046120 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.046347 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.046382 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.046400 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.046474 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:11:57.046453171 +0000 UTC m=+95.509073770 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.109989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.110055 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.110098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.110125 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.110144 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.213806 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.213866 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.213885 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.213909 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.213928 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.251191 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.251192 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.251392 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.251603 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.251701 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:11:53 crc kubenswrapper[4725]: E0227 06:11:53.251799 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.317345 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.317423 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.317440 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.317468 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.317486 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.420687 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.420768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.420790 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.420818 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.420836 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.524108 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.524172 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.524193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.524219 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.524237 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.628156 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.628235 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.628260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.628325 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.628354 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.729993 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.730033 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.730043 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.730060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.730075 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.833267 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.833365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.833376 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.833412 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.833423 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.936204 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.936245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.936256 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.936273 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:53 crc kubenswrapper[4725]: I0227 06:11:53.936303 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:53Z","lastTransitionTime":"2026-02-27T06:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.039174 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.039242 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.039256 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.039277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.039318 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.142206 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.142268 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.142314 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.142338 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.142356 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.244479 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.244579 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.244597 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.244621 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.244639 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.348260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.348364 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.348384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.348409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.348428 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.451470 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.451549 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.451572 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.451599 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.451616 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.554738 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.554799 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.554816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.554838 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.554856 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.657077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.657171 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.657190 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.657213 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.657229 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.759932 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.759975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.759984 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.759997 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.760006 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.861682 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.861766 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.861785 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.861811 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.861858 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.964360 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.964444 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.964469 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.964502 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:54 crc kubenswrapper[4725]: I0227 06:11:54.964522 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:54Z","lastTransitionTime":"2026-02-27T06:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.066610 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.066658 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.066669 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.066687 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.066700 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.169775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.169822 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.169833 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.169850 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.169864 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.251145 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.251223 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:55 crc kubenswrapper[4725]: E0227 06:11:55.251357 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.251405 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:55 crc kubenswrapper[4725]: E0227 06:11:55.251583 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:11:55 crc kubenswrapper[4725]: E0227 06:11:55.251663 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.272652 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.272724 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.272750 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.272781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.272802 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.375696 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.375749 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.375765 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.375786 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.375803 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.479010 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.479090 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.479114 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.479144 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.479164 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.582356 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.582428 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.582446 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.582471 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.582496 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.685970 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.686050 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.686070 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.686103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.686128 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.789510 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.789554 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.789564 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.789578 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.789588 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.892626 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.892665 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.892673 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.892690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.892701 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.996441 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.996515 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.996534 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.996564 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:55 crc kubenswrapper[4725]: I0227 06:11:55.996585 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:55Z","lastTransitionTime":"2026-02-27T06:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.099956 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.099997 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.100009 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.100025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.100034 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.202964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.203022 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.203039 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.203064 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.203081 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.263735 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.264552 4725 scope.go:117] "RemoveContainer" containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.305716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.305758 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.305771 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.305790 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.305819 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.409911 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.409953 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.409963 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.409979 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.409989 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.512272 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.512363 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.512382 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.512407 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.512425 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.614984 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.615043 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.615084 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.615109 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.615127 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.718459 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.718555 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.718603 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.718629 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.718650 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.745976 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.749392 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.750266 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.764976 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.773934 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.791794 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, 
/tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.802780 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.813896 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.823247 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.823343 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.823365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.823436 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.823463 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.827887 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.837377 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.850332 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.925940 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.925979 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.926020 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.926038 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.926050 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:56Z","lastTransitionTime":"2026-02-27T06:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.982340 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.982468 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982524 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:12:04.98249894 +0000 UTC m=+103.445119509 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.982589 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:56 crc kubenswrapper[4725]: I0227 06:11:56.982642 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982663 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982689 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982711 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982761 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982768 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982783 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:04.982761238 +0000 UTC m=+103.445381847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982877 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:04.982867891 +0000 UTC m=+103.445488460 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:11:56 crc kubenswrapper[4725]: E0227 06:11:56.982889 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:04.982883342 +0000 UTC m=+103.445503911 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.029112 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.029178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.029200 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.029230 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.029250 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.084095 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.084339 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.084383 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.084405 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.084473 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:12:05.084451818 +0000 UTC m=+103.547072417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.132880 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.132969 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.132987 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.133012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.133070 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.236716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.236760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.236777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.236799 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.236816 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.250976 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.253665 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.253827 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.258034 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.258210 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:57 crc kubenswrapper[4725]: E0227 06:11:57.258379 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.339811 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.339874 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.339892 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.339914 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.339930 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.444390 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.444438 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.444478 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.444497 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.444509 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.548465 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.548557 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.548587 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.548623 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.548647 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.651737 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.651831 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.651844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.651883 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.651895 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.754620 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.754671 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.754720 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.754743 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.754761 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.858813 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.858860 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.858875 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.858898 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.858914 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.966462 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.966527 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.966537 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.966555 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:57 crc kubenswrapper[4725]: I0227 06:11:57.966565 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:57Z","lastTransitionTime":"2026-02-27T06:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.069897 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.069940 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.069950 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.069963 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.069974 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.173130 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.173156 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.173165 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.173178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.173188 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.276077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.276143 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.276159 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.276178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.276192 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.378795 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.378856 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.378875 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.378900 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.378917 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.481894 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.481949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.481965 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.481989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.482009 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.584432 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.584505 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.584527 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.584548 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.584565 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.687565 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.687607 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.687617 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.687635 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.687645 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.790677 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.790712 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.790724 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.790737 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.790746 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.892974 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.893003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.893012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.893028 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.893037 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.994890 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.994957 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.994979 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.995007 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:58 crc kubenswrapper[4725]: I0227 06:11:58.995028 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:58Z","lastTransitionTime":"2026-02-27T06:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.097814 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.097840 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.097849 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.097861 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.097870 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.200239 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.200276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.200300 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.200317 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.200327 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.250854 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.250897 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.250971 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.251098 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.251548 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.251789 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.270201 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.304500 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.304526 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.304536 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.304548 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.304558 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.407124 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.407243 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.407331 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.407366 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.407420 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.512105 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.512192 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.512211 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.512277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.512330 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.616003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.616089 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.616110 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.616143 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.616165 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.719093 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.719165 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.719185 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.719213 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.719235 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.821786 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.821846 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.821868 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.821891 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.821908 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.900799 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.900864 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.900886 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.900917 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.900938 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.916630 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.921921 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.922014 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.922031 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.922063 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.922083 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.937336 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.943185 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.943249 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.943276 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.943347 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.943377 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.961847 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.961910 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.961935 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.961968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.961995 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.981148 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.981218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.981236 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.981264 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.981283 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.990767 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:11:59 crc kubenswrapper[4725]: E0227 06:11:59.991122 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.993629 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.993691 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.993716 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.993747 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:11:59 crc kubenswrapper[4725]: I0227 06:11:59.993767 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:11:59Z","lastTransitionTime":"2026-02-27T06:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.096901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.096958 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.096975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.097000 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.097018 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.200852 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.200899 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.200915 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.200938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.200955 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.304352 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.304684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.304820 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.304943 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.305070 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.408458 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.409005 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.409154 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.409338 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.409505 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.512841 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.513255 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.513665 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.514012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.514383 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.618572 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.618955 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.619178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.619386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.619563 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.722771 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.723134 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.723359 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.723489 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.723632 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.827025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.827405 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.827737 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.827908 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.828078 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.931507 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.932442 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.932578 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.932715 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:00 crc kubenswrapper[4725]: I0227 06:12:00.932852 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:00Z","lastTransitionTime":"2026-02-27T06:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.035584 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.035789 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.035872 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.035947 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.036037 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.138084 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.138384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.138600 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.138697 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.138796 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.242333 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.242647 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.242782 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.242907 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.243020 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.251425 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:01 crc kubenswrapper[4725]: E0227 06:12:01.251711 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.252324 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.252478 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:01 crc kubenswrapper[4725]: E0227 06:12:01.252755 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:01 crc kubenswrapper[4725]: E0227 06:12:01.253922 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.350709 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.351274 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.351320 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.351352 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.351373 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.455844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.456006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.456025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.456051 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.456070 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.558119 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.558174 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.558194 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.558223 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.558241 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.660853 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.660925 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.660942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.660968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.660985 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.764566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.764615 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.764632 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.764656 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.764675 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.766968 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.767047 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.768321 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.785757 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.814908 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.828079 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea8
3e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.845750 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.847348 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gn9fk"] Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.847878 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.851653 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.853055 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.853618 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.869538 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.871323 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.871387 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.871409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.871435 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.871453 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.890208 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.912360 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.931477 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.947581 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whqnj\" (UniqueName: \"kubernetes.io/projected/d2572a7c-3003-4e40-a052-1718a5ef100d-kube-api-access-whqnj\") pod \"node-resolver-gn9fk\" (UID: \"d2572a7c-3003-4e40-a052-1718a5ef100d\") " pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.947684 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2572a7c-3003-4e40-a052-1718a5ef100d-hosts-file\") pod \"node-resolver-gn9fk\" (UID: \"d2572a7c-3003-4e40-a052-1718a5ef100d\") " pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.948482 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.965177 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.974576 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.974630 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.974648 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.974677 4725 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.974697 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:01Z","lastTransitionTime":"2026-02-27T06:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:01 crc kubenswrapper[4725]: I0227 06:12:01.984500 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-2
7T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.008448 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.026431 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.043425 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.048455 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whqnj\" (UniqueName: \"kubernetes.io/projected/d2572a7c-3003-4e40-a052-1718a5ef100d-kube-api-access-whqnj\") pod \"node-resolver-gn9fk\" (UID: \"d2572a7c-3003-4e40-a052-1718a5ef100d\") " pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.048521 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2572a7c-3003-4e40-a052-1718a5ef100d-hosts-file\") pod \"node-resolver-gn9fk\" (UID: \"d2572a7c-3003-4e40-a052-1718a5ef100d\") " pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:02 crc 
kubenswrapper[4725]: I0227 06:12:02.048627 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2572a7c-3003-4e40-a052-1718a5ef100d-hosts-file\") pod \"node-resolver-gn9fk\" (UID: \"d2572a7c-3003-4e40-a052-1718a5ef100d\") " pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.058682 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.075055 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.077635 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.077679 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.077695 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.077719 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.077737 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.082987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whqnj\" (UniqueName: \"kubernetes.io/projected/d2572a7c-3003-4e40-a052-1718a5ef100d-kube-api-access-whqnj\") pod \"node-resolver-gn9fk\" (UID: \"d2572a7c-3003-4e40-a052-1718a5ef100d\") " pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.093618 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.115523 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.147103 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52
a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.169668 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gn9fk" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.180378 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.180453 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.180470 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.180508 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.180526 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: W0227 06:12:02.187760 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2572a7c_3003_4e40_a052_1718a5ef100d.slice/crio-83369130143bb27b78f81c6e84fbd36381af656509488f6f4f0d4bb200a1b8c4 WatchSource:0}: Error finding container 83369130143bb27b78f81c6e84fbd36381af656509488f6f4f0d4bb200a1b8c4: Status 404 returned error can't find the container with id 83369130143bb27b78f81c6e84fbd36381af656509488f6f4f0d4bb200a1b8c4 Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.228578 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mg969"] Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.228981 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.231322 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gxjff"] Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.231958 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.232447 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.232592 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-g8jqm"] Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.232713 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.232973 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.233130 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.233396 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.235095 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.237393 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.237674 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.239345 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.239672 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.239875 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.240151 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.242271 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.250407 4725 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.283118 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.283167 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.283184 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.283208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.283226 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.290982 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.310776 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.326931 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.342270 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350759 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rswtk\" (UniqueName: \"kubernetes.io/projected/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-kube-api-access-rswtk\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350803 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cnibin\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350836 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-system-cni-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350857 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-os-release\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350877 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-mcd-auth-proxy-config\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350897 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-os-release\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350916 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-socket-dir-parent\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350934 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-cni-bin\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350952 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-kubelet\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350972 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.350993 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351011 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-rootfs\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351029 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrg6\" (UniqueName: \"kubernetes.io/projected/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-kube-api-access-mwrg6\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351048 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-proxy-tls\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351067 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-cnibin\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351097 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-netns\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351115 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-hostroot\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351146 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7439e599-9b13-45e6-8f71-ef3760b2235b-cni-binary-copy\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-cni-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-daemon-config\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351204 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351231 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-etc-kubernetes\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351249 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-system-cni-dir\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351268 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-conf-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351306 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-multus-certs\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351326 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-cni-multus\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351358 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqtx\" (UniqueName: \"kubernetes.io/projected/7439e599-9b13-45e6-8f71-ef3760b2235b-kube-api-access-bhqtx\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.351378 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-k8s-cni-cncf-io\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.360140 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.378130 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.385386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.385428 4725 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.385439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.385457 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.385508 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.394029 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.415247 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.430270 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.444278 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-cnibin\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451823 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-netns\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451861 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-hostroot\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451896 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7439e599-9b13-45e6-8f71-ef3760b2235b-cni-binary-copy\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451929 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-cni-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451962 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-daemon-config\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.451996 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452047 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-etc-kubernetes\") pod \"multus-g8jqm\" (UID: 
\"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452081 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-system-cni-dir\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452117 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-conf-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452149 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-multus-certs\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452181 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-cni-multus\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452227 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqtx\" (UniqueName: \"kubernetes.io/projected/7439e599-9b13-45e6-8f71-ef3760b2235b-kube-api-access-bhqtx\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 
27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452262 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-k8s-cni-cncf-io\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452319 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rswtk\" (UniqueName: \"kubernetes.io/projected/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-kube-api-access-rswtk\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452360 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cnibin\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452406 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-system-cni-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-os-release\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc 
kubenswrapper[4725]: I0227 06:12:02.452480 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-mcd-auth-proxy-config\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452518 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-os-release\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452558 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-socket-dir-parent\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452598 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-cni-bin\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452637 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-kubelet\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452677 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452715 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452746 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-rootfs\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452774 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrg6\" (UniqueName: \"kubernetes.io/projected/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-kube-api-access-mwrg6\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.452806 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-proxy-tls\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 
06:12:02.453567 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-k8s-cni-cncf-io\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454427 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-os-release\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454619 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-cni-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454661 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-hostroot\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454706 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-cnibin\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454733 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-netns\") pod \"multus-g8jqm\" (UID: 
\"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454803 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.454937 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-mcd-auth-proxy-config\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455023 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-socket-dir-parent\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455104 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-cni-bin\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455191 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-etc-kubernetes\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " 
pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455231 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-system-cni-dir\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455261 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-conf-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455318 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-run-multus-certs\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455389 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-cni-multus\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455429 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-os-release\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455434 
4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-rootfs\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455454 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-host-var-lib-kubelet\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455469 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7439e599-9b13-45e6-8f71-ef3760b2235b-system-cni-dir\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.455493 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cnibin\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.456395 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7439e599-9b13-45e6-8f71-ef3760b2235b-multus-daemon-config\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.460083 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.463562 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.468486 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.469729 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rswtk\" (UniqueName: \"kubernetes.io/projected/5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c-kube-api-access-rswtk\") pod \"multus-additional-cni-plugins-gxjff\" (UID: \"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\") " pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.470500 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-proxy-tls\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.470843 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7439e599-9b13-45e6-8f71-ef3760b2235b-cni-binary-copy\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.473439 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrg6\" (UniqueName: \"kubernetes.io/projected/6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198-kube-api-access-mwrg6\") pod \"machine-config-daemon-mg969\" (UID: \"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\") " pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.474110 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqtx\" (UniqueName: \"kubernetes.io/projected/7439e599-9b13-45e6-8f71-ef3760b2235b-kube-api-access-bhqtx\") pod \"multus-g8jqm\" (UID: \"7439e599-9b13-45e6-8f71-ef3760b2235b\") " pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.481695 4725 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.487598 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.487628 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.487639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.487656 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.487668 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.494026 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.503804 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.518254 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.529850 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.541213 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.552785 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.553389 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.563169 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g8jqm" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.565027 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: W0227 06:12:02.568886 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c825ee8_1ec6_4b76_9fdc_1f5dab0b3198.slice/crio-6fbecaf66f677cf89d357943c98237c20b476744061cb307fabd1822275fac6c WatchSource:0}: Error finding container 6fbecaf66f677cf89d357943c98237c20b476744061cb307fabd1822275fac6c: Status 404 returned error can't find the container with id 6fbecaf66f677cf89d357943c98237c20b476744061cb307fabd1822275fac6c Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.572764 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gxjff" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.580514 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: W0227 06:12:02.593359 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7439e599_9b13_45e6_8f71_ef3760b2235b.slice/crio-444f49119b75e7f18ab7be514e8fd922884971fec04c0d100d4ee894f02ce951 WatchSource:0}: Error finding container 444f49119b75e7f18ab7be514e8fd922884971fec04c0d100d4ee894f02ce951: Status 404 returned error can't find the container with id 444f49119b75e7f18ab7be514e8fd922884971fec04c0d100d4ee894f02ce951 Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.595561 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.595597 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.595605 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 
06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.595619 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.595631 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: W0227 06:12:02.597789 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c170c9d_f4fe_48cb_b9f5_8c5e7d54ea8c.slice/crio-15718218ccd5fbb00ae4810c5f8884e1521855a651b01ee20be9f06f0e90e801 WatchSource:0}: Error finding container 15718218ccd5fbb00ae4810c5f8884e1521855a651b01ee20be9f06f0e90e801: Status 404 returned error can't find the container with id 15718218ccd5fbb00ae4810c5f8884e1521855a651b01ee20be9f06f0e90e801 Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.597889 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.608134 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.627108 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lchm9"] Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.627823 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.628307 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.631800 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.632218 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.632512 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.633677 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.633704 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.633883 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.634241 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.643909 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.679698 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.702672 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.704380 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.704409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.704419 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.704436 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.704448 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.749304 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763813 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-systemd-units\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763868 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763897 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-openvswitch\") pod 
\"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763919 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-log-socket\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763938 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-env-overrides\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763956 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbt4v\" (UniqueName: \"kubernetes.io/projected/05a446dc-e501-4173-a911-7b33ca4608c6-kube-api-access-sbt4v\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.763991 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-slash\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764010 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-var-lib-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764033 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764054 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-config\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764087 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-kubelet\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764107 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-netns\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764125 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-ovn\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764156 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-systemd\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764178 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-node-log\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764200 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05a446dc-e501-4173-a911-7b33ca4608c6-ovn-node-metrics-cert\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764231 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-script-lib\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764216 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764254 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-bin\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764304 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-etc-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.764328 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-netd\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.776706 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.778314 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerStarted","Data":"15718218ccd5fbb00ae4810c5f8884e1521855a651b01ee20be9f06f0e90e801"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.780051 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerStarted","Data":"41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.780104 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerStarted","Data":"444f49119b75e7f18ab7be514e8fd922884971fec04c0d100d4ee894f02ce951"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.782905 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72"} Feb 27 
06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.782937 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"6fbecaf66f677cf89d357943c98237c20b476744061cb307fabd1822275fac6c"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.785894 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gn9fk" event={"ID":"d2572a7c-3003-4e40-a052-1718a5ef100d","Type":"ContainerStarted","Data":"86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.785918 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gn9fk" event={"ID":"d2572a7c-3003-4e40-a052-1718a5ef100d","Type":"ContainerStarted","Data":"83369130143bb27b78f81c6e84fbd36381af656509488f6f4f0d4bb200a1b8c4"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.792307 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.806145 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.806949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 
06:12:02.806975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.806987 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.807004 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.807016 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.818067 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.829834 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.844876 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.856297 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865525 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-systemd-units\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865563 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865607 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865626 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-log-socket\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865644 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-env-overrides\") pod \"ovnkube-node-lchm9\" (UID: 
\"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865659 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbt4v\" (UniqueName: \"kubernetes.io/projected/05a446dc-e501-4173-a911-7b33ca4608c6-kube-api-access-sbt4v\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865649 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-systemd-units\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865685 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-slash\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865702 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-var-lib-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865745 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-var-lib-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865785 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-log-socket\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865796 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.865818 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-config\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866024 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866079 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-kubelet\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866048 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-kubelet\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866147 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-netns\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866180 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-ovn\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866263 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-systemd\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866311 
4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-node-log\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866334 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05a446dc-e501-4173-a911-7b33ca4608c6-ovn-node-metrics-cert\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866356 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-env-overrides\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-script-lib\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866410 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-bin\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-etc-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-netd\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866505 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-netd\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866576 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866706 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-config\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866762 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-slash\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866769 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-node-log\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-systemd\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866825 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-bin\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866850 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-etc-openvswitch\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866889 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-netns\") pod \"ovnkube-node-lchm9\" (UID: 
\"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866922 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-ovn\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.866999 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-script-lib\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.871886 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.872210 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05a446dc-e501-4173-a911-7b33ca4608c6-ovn-node-metrics-cert\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.883503 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 
06:12:02.899642 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.909151 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.909193 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.909201 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.909217 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.909225 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:02Z","lastTransitionTime":"2026-02-27T06:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.919705 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.934424 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.945992 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.971742 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.976408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbt4v\" (UniqueName: 
\"kubernetes.io/projected/05a446dc-e501-4173-a911-7b33ca4608c6-kube-api-access-sbt4v\") pod \"ovnkube-node-lchm9\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.979679 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:02 crc kubenswrapper[4725]: I0227 06:12:02.999624 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967
b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.012433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.012491 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.012509 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.012534 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.012552 4725 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.016425 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.033841 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.048229 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.066119 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: W0227 06:12:03.080112 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a446dc_e501_4173_a911_7b33ca4608c6.slice/crio-35e552ee2754b532b808590a888e54469cd21b2f24d9cd0ffc830180f60958a9 WatchSource:0}: Error finding container 35e552ee2754b532b808590a888e54469cd21b2f24d9cd0ffc830180f60958a9: Status 404 returned error can't find the container with id 35e552ee2754b532b808590a888e54469cd21b2f24d9cd0ffc830180f60958a9 Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.083350 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.096919 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.115491 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.115543 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.115554 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.115569 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.115580 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.123768 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.217939 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.218018 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.218038 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.218061 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.218075 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.250569 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.250668 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:03 crc kubenswrapper[4725]: E0227 06:12:03.250718 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:03 crc kubenswrapper[4725]: E0227 06:12:03.250818 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.250575 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:03 crc kubenswrapper[4725]: E0227 06:12:03.250926 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.321505 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.321556 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.321567 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.321584 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.321596 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.424003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.424036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.424045 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.424058 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.424066 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.527113 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.527153 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.527161 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.527175 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.527184 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.629323 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.629362 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.629370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.629385 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.629394 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.731520 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.731563 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.731575 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.731588 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.731597 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.789831 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c" containerID="fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561" exitCode=0 Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.789910 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerDied","Data":"fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.792062 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" exitCode=0 Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.792121 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.792148 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"35e552ee2754b532b808590a888e54469cd21b2f24d9cd0ffc830180f60958a9"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.795711 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.808922 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.832768 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.834690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.834741 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.834759 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.834786 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.834803 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.857865 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.879714 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.900960 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.917687 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.936960 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:03 crc 
kubenswrapper[4725]: I0227 06:12:03.937007 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.937025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.937049 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.937067 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:03Z","lastTransitionTime":"2026-02-27T06:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.938715 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.951161 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.971627 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:03 crc kubenswrapper[4725]: I0227 06:12:03.989633 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.008035 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.025455 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.039275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.039373 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.039395 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.039422 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.039442 4725 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.041440 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.064947 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.088626 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.107110 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.126840 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.142893 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc 
kubenswrapper[4725]: I0227 06:12:04.142948 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.142967 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.142992 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.143010 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.162844 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.179146 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.198265 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.221744 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.241111 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.245873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 
06:12:04.245950 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.245968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.245992 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.246009 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.261435 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.274984 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.292066 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.312374 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.327030 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.349686 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.350726 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.350798 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.350816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.350840 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.350859 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.453996 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.454158 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.454269 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.454411 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.454516 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.557236 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.557313 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.557332 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.557357 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.557376 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.660076 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.660275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.660430 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.660520 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.660717 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.763098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.763433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.763444 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.763457 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.763467 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.800141 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.802567 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c" containerID="bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541" exitCode=0 Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.802653 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerDied","Data":"bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.811983 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.812047 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.812068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.812091 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.812111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.822434 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.859925 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.865790 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.865834 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.865849 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.865867 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.865879 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.877061 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.903976 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.928978 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.948735 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.965421 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3
fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:04Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.968683 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.968719 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.968728 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.968749 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.968761 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:04Z","lastTransitionTime":"2026-02-27T06:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.988790 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.988898 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:12:20.9888725 +0000 UTC m=+119.451493069 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.989000 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.989094 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:04 crc kubenswrapper[4725]: I0227 06:12:04.989148 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989407 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 
06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989447 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989471 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989494 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989563 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:20.989536029 +0000 UTC m=+119.452156628 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989602 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:12:20.989584591 +0000 UTC m=+119.452205410 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989608 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:12:04 crc kubenswrapper[4725]: E0227 06:12:04.989711 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:20.989682383 +0000 UTC m=+119.452302992 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.015855 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.034933 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.050178 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.067066 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.071691 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.071729 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.071743 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.071761 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.071777 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.083365 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.090506 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.090701 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.090737 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.090754 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 
27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.090828 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:21.090803667 +0000 UTC m=+119.553424246 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.105020 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.122101 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.147682 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.165128 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.174087 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.174128 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.174140 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.174159 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.174172 4725 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.182018 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.199489 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.221912 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.247106 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.250535 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.250551 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.250657 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.251050 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.250862 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:05 crc kubenswrapper[4725]: E0227 06:12:05.251240 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.276756 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.276960 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.277018 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.277101 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.277156 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.283607 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.298098 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.310576 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.323116 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.339431 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.356889 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.377213 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.379734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.379917 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.380050 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.380178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.380348 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.396316 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.484199 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.484505 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.484649 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.484832 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.484984 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.587551 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.587634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.587653 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.587679 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.587698 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.691162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.691221 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.691239 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.691267 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.691369 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.794159 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.794198 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.794212 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.794229 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.794242 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.817466 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.819426 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c" containerID="6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d" exitCode=0 Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.819458 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerDied","Data":"6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.846991 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.863464 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.880865 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.898269 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.898370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.898394 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.898425 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.898451 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:05Z","lastTransitionTime":"2026-02-27T06:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.900516 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.919234 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.931019 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.947797 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.967108 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91
a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:05 crc kubenswrapper[4725]: I0227 06:12:05.980286 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\
\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:05Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.001657 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.001686 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.001697 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.001713 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.001725 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.005072 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.036669 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.059814 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.079400 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.095984 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.104702 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc 
kubenswrapper[4725]: I0227 06:12:06.104744 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.104757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.104776 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.104786 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.206944 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.206980 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.206990 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.207003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.207014 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.309605 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.309642 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.309656 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.309674 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.309687 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.412760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.412827 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.412845 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.412871 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.412891 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.516777 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.516844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.516862 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.516893 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.516911 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.603567 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.619998 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.620061 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.620083 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.620114 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.620137 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.620881 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.642125 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.666707 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.686219 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.705592 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.723116 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.723535 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc 
kubenswrapper[4725]: I0227 06:12:06.723575 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.723588 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.723608 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.723622 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.746696 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.763141 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.785821 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.805389 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.821849 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.826709 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.826770 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.826794 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.826825 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.826848 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.829131 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c" containerID="d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745" exitCode=0 Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.829195 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerDied","Data":"d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.842857 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.856069 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.884383 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91
a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.902498 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\
\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.918539 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.929529 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.929605 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.929628 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.929658 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.929680 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:06Z","lastTransitionTime":"2026-02-27T06:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.953720 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.972630 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:06 crc kubenswrapper[4725]: I0227 06:12:06.987246 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:06Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.005021 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.032715 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc 
kubenswrapper[4725]: I0227 06:12:07.032786 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.032802 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.032824 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.032839 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.035372 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.049872 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.075681 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.099600 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.114461 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.127065 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.135528 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.135603 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.135627 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.135658 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.135682 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.138011 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.154784 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91
a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.238630 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.238676 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.238689 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.238710 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.238722 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.251192 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.251190 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:07 crc kubenswrapper[4725]: E0227 06:12:07.251372 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:07 crc kubenswrapper[4725]: E0227 06:12:07.251502 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.251211 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:07 crc kubenswrapper[4725]: E0227 06:12:07.251659 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.341097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.341161 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.341180 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.341208 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.341230 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.448117 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.448932 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.448945 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.448964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.448975 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.551751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.551812 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.551831 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.551854 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.551873 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.655039 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.655077 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.655088 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.655106 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.655120 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.757260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.757347 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.757368 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.757392 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.757411 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.838727 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.842992 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c" containerID="e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6" exitCode=0 Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.843062 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerDied","Data":"e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.859872 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.859934 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.859951 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.859977 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.859995 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.872960 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.894356 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.911872 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.938682 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.954777 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.967034 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.967094 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.967112 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.967136 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.967155 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:07Z","lastTransitionTime":"2026-02-27T06:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.972489 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:07 crc kubenswrapper[4725]: I0227 06:12:07.986437 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:07Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.005264 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.016711 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.029034 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.041486 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.059213 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.069724 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.070377 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.070408 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.070420 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.070436 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.070447 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.085639 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.173263 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.173320 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.173335 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.173353 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.173364 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.276808 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.276844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.276853 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.276869 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.276879 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.379801 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.379841 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.379850 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.379866 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.379876 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.482465 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.482514 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.482522 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.482537 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.482548 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.585179 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.585259 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.585312 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.585351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.585437 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.688999 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.689065 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.689085 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.689113 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.689134 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.774764 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zpdks"] Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.775365 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.778571 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.778753 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.778941 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.778970 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.793972 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.794021 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.794038 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.794061 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.794081 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.798484 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.815220 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.851604 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c" containerID="4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3" exitCode=0 Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.851659 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.851822 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerDied","Data":"4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.874047 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.897245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.898115 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.898359 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.898572 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.898799 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:08Z","lastTransitionTime":"2026-02-27T06:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.905480 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.926053 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.933044 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4n5\" (UniqueName: \"kubernetes.io/projected/3dcaad94-78ba-48c7-aac6-9d8352419ed5-kube-api-access-wm4n5\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.933162 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dcaad94-78ba-48c7-aac6-9d8352419ed5-serviceca\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.933274 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dcaad94-78ba-48c7-aac6-9d8352419ed5-host\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.942578 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.965076 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:08 crc kubenswrapper[4725]: I0227 06:12:08.986188 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:08Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.002376 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.002410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.002426 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.002449 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.002465 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.009002 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.034755 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4n5\" (UniqueName: \"kubernetes.io/projected/3dcaad94-78ba-48c7-aac6-9d8352419ed5-kube-api-access-wm4n5\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.034812 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3dcaad94-78ba-48c7-aac6-9d8352419ed5-serviceca\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.034881 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dcaad94-78ba-48c7-aac6-9d8352419ed5-host\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.034983 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3dcaad94-78ba-48c7-aac6-9d8352419ed5-host\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.036878 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/3dcaad94-78ba-48c7-aac6-9d8352419ed5-serviceca\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.039257 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.058620 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.069949 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4n5\" (UniqueName: \"kubernetes.io/projected/3dcaad94-78ba-48c7-aac6-9d8352419ed5-kube-api-access-wm4n5\") pod \"node-ca-zpdks\" (UID: \"3dcaad94-78ba-48c7-aac6-9d8352419ed5\") " pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.079373 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.096529 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zpdks" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.098554 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.106263 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.106368 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.106394 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.106422 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.106440 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: W0227 06:12:09.118684 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcaad94_78ba_48c7_aac6_9d8352419ed5.slice/crio-807cac12c7d91159fe6ee0adbcfa41f36eb500ee6201e7fc9cc094de49a0cb4e WatchSource:0}: Error finding container 807cac12c7d91159fe6ee0adbcfa41f36eb500ee6201e7fc9cc094de49a0cb4e: Status 404 returned error can't find the container with id 807cac12c7d91159fe6ee0adbcfa41f36eb500ee6201e7fc9cc094de49a0cb4e Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.119057 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.140551 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65
f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.160106 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.179989 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.210218 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.210266 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.210277 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.210315 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.210327 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.212883 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.228381 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.239955 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.251528 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:09 crc kubenswrapper[4725]: E0227 06:12:09.251652 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.251931 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:09 crc kubenswrapper[4725]: E0227 06:12:09.251983 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.252008 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:09 crc kubenswrapper[4725]: E0227 06:12:09.252066 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.252178 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.264356 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-reso
lver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.280443 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.296571 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.312863 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.313172 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.313210 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.313225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.313245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.313258 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.325504 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.348661 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.358992 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.375153 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.415938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.416000 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.416019 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.416047 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.416065 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.526675 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.526741 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.526760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.526782 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.526801 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.630235 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.630323 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.630374 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.630409 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.630431 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.732463 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.732491 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.732499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.732511 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.732521 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.835990 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.836073 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.836097 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.836126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.836153 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.859651 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.860064 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.860091 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.867155 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" event={"ID":"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c","Type":"ContainerStarted","Data":"0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.873222 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zpdks" event={"ID":"3dcaad94-78ba-48c7-aac6-9d8352419ed5","Type":"ContainerStarted","Data":"312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.873310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zpdks" event={"ID":"3dcaad94-78ba-48c7-aac6-9d8352419ed5","Type":"ContainerStarted","Data":"807cac12c7d91159fe6ee0adbcfa41f36eb500ee6201e7fc9cc094de49a0cb4e"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.885724 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.901801 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 
06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.906814 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.921518 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.938852 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.939119 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.939130 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 
06:12:09.939148 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.939159 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:09Z","lastTransitionTime":"2026-02-27T06:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.956528 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.971454 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.979922 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.989845 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:09 crc kubenswrapper[4725]: I0227 06:12:09.997572 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:09Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.008656 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cce
eeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.018754 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.027769 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.037437 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.040771 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.040796 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.040804 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.040816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.040825 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.055416 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.070366 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc 
kubenswrapper[4725]: I0227 06:12:10.088776 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.106505 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.119328 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.138413 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.143084 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc 
kubenswrapper[4725]: I0227 06:12:10.143150 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.143169 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.143195 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.143213 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.155957 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.156025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.156042 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.156070 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.156086 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.160971 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85
c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: E0227 06:12:10.174203 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.177452 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.178941 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.178989 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.179002 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.179025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.179047 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.192099 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: E0227 06:12:10.197891 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.204364 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.204424 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.204446 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.204476 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.204496 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.214901 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: E0227 06:12:10.226182 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.240018 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.240096 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.240122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.240154 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.240177 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.263843 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: E0227 06:12:10.275942 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.279431 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.279493 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.279512 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.279545 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.279574 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.283160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: E0227 06:12:10.293043 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: E0227 06:12:10.293149 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.294832 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.294860 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.294869 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.294883 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.294894 4725 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.295931 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.309360 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.320190 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.332880 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.346177 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.368492 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.398566 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.398631 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.398654 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.398682 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.398700 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.512351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.512477 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.512503 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.512541 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.512565 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.621645 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.621709 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.621724 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.621751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.621763 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.724768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.724839 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.724857 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.724883 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.724901 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.830632 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.830714 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.830735 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.830760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.830776 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.877884 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.918610 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.932874 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.932913 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.932925 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.932944 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.932955 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:10Z","lastTransitionTime":"2026-02-27T06:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.936036 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.954919 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.974139 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:10 crc kubenswrapper[4725]: I0227 06:12:10.987278 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:10Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.005593 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.028031 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.035265 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc 
kubenswrapper[4725]: I0227 06:12:11.035353 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.035375 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.035405 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.035423 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.061597 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.075175 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.088723 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.104294 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.125954 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.137988 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.138070 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.138091 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.138126 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.138148 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.142612 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.161160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.176011 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.195674 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:11Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.241222 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.241327 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.241348 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.241375 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.241398 4725 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.251011 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.251011 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.251029 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:11 crc kubenswrapper[4725]: E0227 06:12:11.251352 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:11 crc kubenswrapper[4725]: E0227 06:12:11.251494 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:11 crc kubenswrapper[4725]: E0227 06:12:11.251607 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.346533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.346627 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.346654 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.346694 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.346721 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.451227 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.451322 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.451344 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.451383 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.451404 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.554961 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.555017 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.555036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.555064 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.555089 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.660931 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.660992 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.661007 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.661033 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.661047 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.763979 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.764036 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.764052 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.764082 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.764099 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.866814 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.866878 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.866895 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.866920 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.866938 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.970270 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.970354 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.970371 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.970998 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:11 crc kubenswrapper[4725]: I0227 06:12:11.971058 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:11Z","lastTransitionTime":"2026-02-27T06:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.073862 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.073919 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.073936 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.073960 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.073977 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.176843 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.176910 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.176927 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.176952 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.176973 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.273333 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.279634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.279686 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.279697 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.279718 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.279730 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.288580 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.301420 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.316262 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.330599 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.350105 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.367603 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.382867 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.382971 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.382995 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.383021 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.383040 4725 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.389157 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.423756 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.440323 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.457917 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.477904 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.485379 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc 
kubenswrapper[4725]: I0227 06:12:12.485420 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.485434 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.485454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.485468 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.509849 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.533894 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.553160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.588211 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.588262 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.588273 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.588307 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.588322 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.692066 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.692131 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.692151 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.692173 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.692192 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.794844 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.794906 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.794924 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.794949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.794966 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.886648 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/0.log" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.891269 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8" exitCode=1 Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.891386 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.892593 4725 scope.go:117] "RemoveContainer" containerID="3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.899012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.899060 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.899078 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.899112 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.899135 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:12Z","lastTransitionTime":"2026-02-27T06:12:12Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.928627 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2
947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.948634 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.968778 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:12 crc kubenswrapper[4725]: I0227 06:12:12.986628 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.005342 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.005456 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.005480 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.005554 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.005579 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.012695 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.041774 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.066256 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.088559 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.108078 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.111030 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.111130 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.111150 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.111178 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.111197 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.125486 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.147490 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.176242 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:12Z\\\",\\\"message\\\":\\\"541712 6543 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 06:12:12.541735 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 06:12:12.543585 6543 reflector.go:311] Stopping reflector 
*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543638 6543 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543640 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 06:12:12.543679 6543 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 06:12:12.543691 6543 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 06:12:12.543716 6543 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:12.543732 6543 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:12.543750 6543 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 06:12:12.543772 6543 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 06:12:12.543764 6543 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:12.543790 6543 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 06:12:12.543805 6543 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:12.544598 6543 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.191921 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.212423 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.214082 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.214138 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.214164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.214194 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.214221 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.232548 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.250808 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.250818 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:13 crc kubenswrapper[4725]: E0227 06:12:13.250999 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.250818 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:13 crc kubenswrapper[4725]: E0227 06:12:13.251126 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:13 crc kubenswrapper[4725]: E0227 06:12:13.251254 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.317540 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.317606 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.317621 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.317642 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.317654 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.420463 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.420583 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.420601 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.420634 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.420650 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.523741 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.523813 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.523827 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.523853 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.523871 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.626403 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.626454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.626466 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.626486 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.626498 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.729499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.729550 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.729562 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.729583 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.729595 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.832233 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.832341 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.832360 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.832386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.832407 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.899414 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/0.log" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.904321 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.905005 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.926411 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c3
3c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.935539 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.935600 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.935623 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.935649 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.935669 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:13Z","lastTransitionTime":"2026-02-27T06:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.945046 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.976578 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488
e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:13 crc kubenswrapper[4725]: I0227 06:12:13.992218 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:13Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.010164 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.027904 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.038550 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.038606 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.038624 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.038647 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.038664 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.043211 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.066350 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.090637 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.111551 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.140832 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.140982 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.141006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.141023 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.141036 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.147996 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:12Z\\\",\\\"message\\\":\\\"541712 6543 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 06:12:12.541735 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 06:12:12.543585 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543638 6543 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543640 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 06:12:12.543679 6543 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 06:12:12.543691 6543 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 06:12:12.543716 6543 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:12.543732 6543 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:12.543750 6543 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 06:12:12.543772 6543 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 06:12:12.543764 6543 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:12.543790 6543 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 06:12:12.543805 6543 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:12.544598 6543 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.165142 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.184438 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.202208 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.219650 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.243730 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc 
kubenswrapper[4725]: I0227 06:12:14.243781 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.243799 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.243823 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.243873 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.346766 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.346860 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.346879 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.346944 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.346964 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.449779 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.449828 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.449847 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.449871 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.449888 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.553122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.553190 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.553210 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.553237 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.553256 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.656536 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.656596 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.656614 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.656636 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.656653 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.760564 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.760639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.760662 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.760690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.760710 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.863980 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.864062 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.864091 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.864122 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.864144 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.911076 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/1.log" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.911894 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/0.log" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.915713 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7" exitCode=1 Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.915770 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.915818 4725 scope.go:117] "RemoveContainer" containerID="3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.918484 4725 scope.go:117] "RemoveContainer" containerID="da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7" Feb 27 06:12:14 crc kubenswrapper[4725]: E0227 06:12:14.918878 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.928762 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk"] Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.929183 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.931772 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.931960 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.941200 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd9
0d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.963221 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.966998 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.967047 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.967063 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.967085 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.967099 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:14Z","lastTransitionTime":"2026-02-27T06:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:14 crc kubenswrapper[4725]: I0227 06:12:14.984573 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:14Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.004193 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.020842 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.044634 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.062433 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.070129 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.070162 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.070173 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.070192 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.070204 4725 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.082904 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.105380 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0b5a9f66-b388-4334-82c5-d3c8de8d86be-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.105470 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b5a9f66-b388-4334-82c5-d3c8de8d86be-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.105502 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0b5a9f66-b388-4334-82c5-d3c8de8d86be-env-overrides\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: 
\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.105570 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knvjj\" (UniqueName: \"kubernetes.io/projected/0b5a9f66-b388-4334-82c5-d3c8de8d86be-kube-api-access-knvjj\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.115021 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:12Z\\\",\\\"message\\\":\\\"541712 6543 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 06:12:12.541735 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 06:12:12.543585 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543638 6543 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543640 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 06:12:12.543679 6543 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 06:12:12.543691 6543 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 06:12:12.543716 6543 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:12.543732 6543 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:12.543750 6543 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 06:12:12.543772 6543 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 06:12:12.543764 6543 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:12.543790 6543 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 06:12:12.543805 6543 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:12.544598 6543 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cn
i-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.136985 4725 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.156687 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.173365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.173421 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.173439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 
06:12:15.173464 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.173484 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.178826 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.206270 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knvjj\" (UniqueName: \"kubernetes.io/projected/0b5a9f66-b388-4334-82c5-d3c8de8d86be-kube-api-access-knvjj\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.206392 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0b5a9f66-b388-4334-82c5-d3c8de8d86be-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.206449 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0b5a9f66-b388-4334-82c5-d3c8de8d86be-env-overrides\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.206487 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b5a9f66-b388-4334-82c5-d3c8de8d86be-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.207802 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0b5a9f66-b388-4334-82c5-d3c8de8d86be-env-overrides\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.212013 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5
488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.215331 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b5a9f66-b388-4334-82c5-d3c8de8d86be-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.217856 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0b5a9f66-b388-4334-82c5-d3c8de8d86be-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.235001 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.237882 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knvjj\" (UniqueName: \"kubernetes.io/projected/0b5a9f66-b388-4334-82c5-d3c8de8d86be-kube-api-access-knvjj\") pod \"ovnkube-control-plane-749d76644c-phgmk\" (UID: \"0b5a9f66-b388-4334-82c5-d3c8de8d86be\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.250481 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.250447 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.250654 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.250721 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.250834 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.251024 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.251161 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.254614 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.266208 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: W0227 06:12:15.273725 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5a9f66_b388_4334_82c5_d3c8de8d86be.slice/crio-bbbdaef9363c487e6bee53761967c9404f59abea03c96f71fd859a5c9aed9e67 WatchSource:0}: Error finding container bbbdaef9363c487e6bee53761967c9404f59abea03c96f71fd859a5c9aed9e67: Status 404 returned error can't find the container with id bbbdaef9363c487e6bee53761967c9404f59abea03c96f71fd859a5c9aed9e67 Feb 27 06:12:15 crc 
kubenswrapper[4725]: I0227 06:12:15.277780 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.277843 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.277861 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.277880 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.277893 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.282954 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z 
is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.298187 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.312730 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.327057 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.351993 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9
d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.368905 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.380757 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.380792 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.380805 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc 
kubenswrapper[4725]: I0227 06:12:15.380820 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.380832 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.384381 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.399863 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.414029 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.437000 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.458806 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.477632 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.483464 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.483499 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.483509 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.483527 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.483539 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.501799 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.524032 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:12Z\\\",\\\"message\\\":\\\"541712 6543 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 06:12:12.541735 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 06:12:12.543585 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543638 6543 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543640 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 06:12:12.543679 6543 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 06:12:12.543691 6543 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 06:12:12.543716 6543 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:12.543732 6543 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:12.543750 6543 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 06:12:12.543772 6543 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 06:12:12.543764 6543 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:12.543790 6543 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 06:12:12.543805 6543 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:12.544598 6543 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cn
i-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.539266 4725 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1
a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.586947 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.587029 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.587055 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.587086 4725 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.587110 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.683256 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vcl2g"] Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.684050 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.684180 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.690220 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.690341 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.690369 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.690402 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.690424 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.701976 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.733988 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9
d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.754976 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.774930 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.793461 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.793517 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.793534 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.793557 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.793576 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.800310 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.812861 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fz8\" (UniqueName: \"kubernetes.io/projected/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-kube-api-access-42fz8\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.812975 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.821063 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.851478 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad
52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.870549 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.885485 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.896838 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.896908 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.896932 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.896964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.896988 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:15Z","lastTransitionTime":"2026-02-27T06:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.902183 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.913650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.913733 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42fz8\" (UniqueName: \"kubernetes.io/projected/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-kube-api-access-42fz8\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.913910 4725 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.914020 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:16.413991277 +0000 UTC m=+114.876611876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.925663 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/1.log" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.940686 4725 scope.go:117] "RemoveContainer" containerID="da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7" Feb 27 06:12:15 crc kubenswrapper[4725]: E0227 06:12:15.940955 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.942106 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" event={"ID":"0b5a9f66-b388-4334-82c5-d3c8de8d86be","Type":"ContainerStarted","Data":"fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b"} Feb 27 06:12:15 crc 
kubenswrapper[4725]: I0227 06:12:15.942182 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" event={"ID":"0b5a9f66-b388-4334-82c5-d3c8de8d86be","Type":"ContainerStarted","Data":"baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.942205 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" event={"ID":"0b5a9f66-b388-4334-82c5-d3c8de8d86be","Type":"ContainerStarted","Data":"bbbdaef9363c487e6bee53761967c9404f59abea03c96f71fd859a5c9aed9e67"} Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.942462 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fz8\" (UniqueName: \"kubernetes.io/projected/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-kube-api-access-42fz8\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.947460 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cced5a4af19817d7638cfc7fb5e2c5cdb9b7ac62d40f69ecd8fdbbfd7ce2fa8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:12Z\\\",\\\"message\\\":\\\"541712 6543 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 06:12:12.541735 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 06:12:12.543585 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543638 6543 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:12.543640 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 06:12:12.543679 6543 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 06:12:12.543691 6543 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 06:12:12.543716 6543 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:12.543732 6543 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:12.543750 6543 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 06:12:12.543772 6543 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 06:12:12.543764 6543 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:12.543790 6543 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 06:12:12.543805 6543 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:12.544598 6543 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cn
i-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.964174 4725 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1
a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.984315 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:15 crc kubenswrapper[4725]: I0227 06:12:15.999938 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:15.999990 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.000007 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.000029 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.000046 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.002211 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:15Z 
is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.016096 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc 
kubenswrapper[4725]: I0227 06:12:16.036027 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.059623 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.081414 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aae
db6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.096052 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.102684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.102731 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.102750 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.102776 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.102793 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.114266 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.127152 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.140890 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.153991 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.166178 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.185427 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.204829 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.204937 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.204964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.204996 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.205021 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.216740 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.233263 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.251015 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.272401 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.289066 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc 
kubenswrapper[4725]: I0227 06:12:16.307755 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.307852 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.307870 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.307894 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.307913 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.321737 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.338206 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.353203 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.369603 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:16Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.410439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.410801 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.411003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.411175 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.411395 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.419088 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:16 crc kubenswrapper[4725]: E0227 06:12:16.419393 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:16 crc kubenswrapper[4725]: E0227 06:12:16.419499 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:17.419474489 +0000 UTC m=+115.882095088 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.514439 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.514502 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.514525 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.514555 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.514579 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.617447 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.617546 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.617564 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.617590 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.617613 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.721741 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.722071 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.722090 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.722114 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.722134 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.824901 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.824959 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.824975 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.824998 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.825016 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.927458 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.927559 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.927582 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.927613 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:16 crc kubenswrapper[4725]: I0227 06:12:16.927645 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:16Z","lastTransitionTime":"2026-02-27T06:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.032378 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.032433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.032450 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.032473 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.032490 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.135873 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.135973 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.135992 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.136016 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.136033 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.242768 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.242836 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.242864 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.242889 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.242908 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.250571 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.250583 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.250700 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.250730 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:17 crc kubenswrapper[4725]: E0227 06:12:17.250908 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:17 crc kubenswrapper[4725]: E0227 06:12:17.251120 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:17 crc kubenswrapper[4725]: E0227 06:12:17.251320 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:17 crc kubenswrapper[4725]: E0227 06:12:17.251502 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.345125 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.345167 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.345182 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.345201 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.345212 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.430838 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:17 crc kubenswrapper[4725]: E0227 06:12:17.431092 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:17 crc kubenswrapper[4725]: E0227 06:12:17.431214 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:19.431185797 +0000 UTC m=+117.893806406 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.448333 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.448382 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.448399 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.448423 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.448442 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.550836 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.550905 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.550929 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.550962 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.550984 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.654719 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.654813 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.654841 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.654876 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.654900 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.758360 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.759081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.759238 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.759418 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.759548 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.862616 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.862673 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.862690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.862714 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.862730 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.966375 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.966431 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.966448 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.966474 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:17 crc kubenswrapper[4725]: I0227 06:12:17.966491 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:17Z","lastTransitionTime":"2026-02-27T06:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.069779 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.069919 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.069939 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.069965 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.069986 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.173260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.173351 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.173370 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.173401 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.173419 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.277914 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.278993 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.279004 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.279022 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.279034 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.381275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.381385 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.381405 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.381431 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.381450 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.484396 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.484478 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.484507 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.484535 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.484558 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.587974 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.588040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.588057 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.588082 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.588103 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.691624 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.691698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.691720 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.691749 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.691769 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.794490 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.794559 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.794576 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.794601 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.794618 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.899137 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.899203 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.899222 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.899247 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:18 crc kubenswrapper[4725]: I0227 06:12:18.899272 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:18Z","lastTransitionTime":"2026-02-27T06:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.002535 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.002597 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.002616 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.002640 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.002663 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.106588 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.106666 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.106690 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.106724 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.106745 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.210952 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.211043 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.211072 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.211109 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.211134 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.250706 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.250770 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.250838 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.250945 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:19 crc kubenswrapper[4725]: E0227 06:12:19.250953 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:19 crc kubenswrapper[4725]: E0227 06:12:19.251106 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:19 crc kubenswrapper[4725]: E0227 06:12:19.251188 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:19 crc kubenswrapper[4725]: E0227 06:12:19.251242 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.314775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.314833 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.314845 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.314865 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.314879 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.418103 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.418177 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.418201 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.418232 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.418255 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.452607 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:19 crc kubenswrapper[4725]: E0227 06:12:19.452839 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:19 crc kubenswrapper[4725]: E0227 06:12:19.452966 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:23.452936062 +0000 UTC m=+121.915556721 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.523994 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.524061 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.524079 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.524105 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.524122 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.627890 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.627959 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.627977 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.628002 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:19 crc kubenswrapper[4725]: I0227 06:12:19.628020 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:19Z","lastTransitionTime":"2026-02-27T06:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: E0227 06:12:20.408629 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:20Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.415669 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.415740 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.415760 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.415787 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.415809 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: E0227 06:12:20.465386 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:20Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.470849 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.470908 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.470925 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.470949 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.470966 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: E0227 06:12:20.491038 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:20Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.495834 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.495880 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.495896 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.495919 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.495936 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: E0227 06:12:20.515708 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:20Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:20 crc kubenswrapper[4725]: E0227 06:12:20.515965 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.517816 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.517860 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.517877 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.517896 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.517914 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.620268 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.620347 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.620365 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.620386 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.620403 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.722942 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.722985 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.723003 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.723025 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.723042 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.826156 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.826233 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.826258 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.826325 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.826343 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.929693 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.929769 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.929794 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.929827 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:20 crc kubenswrapper[4725]: I0227 06:12:20.929848 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:20Z","lastTransitionTime":"2026-02-27T06:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.032373 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.032423 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.032436 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.032454 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.032466 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.068734 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.068921 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 06:12:53.068892726 +0000 UTC m=+151.531513345 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.068977 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.069069 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.069140 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069324 4725 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069393 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:53.06937321 +0000 UTC m=+151.531993819 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069530 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069551 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069565 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069574 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069602 4725 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:53.069591086 +0000 UTC m=+151.532211665 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.069703 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:53.069677019 +0000 UTC m=+151.532297618 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.135389 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.135479 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.135498 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.135523 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.135542 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.170115 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.170377 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.170427 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.170448 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.170536 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:12:53.170510603 +0000 UTC m=+151.633131222 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.238573 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.238648 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.238665 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.238688 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.238705 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.250942 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.251021 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.251046 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.251101 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.250950 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.251242 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.251374 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:21 crc kubenswrapper[4725]: E0227 06:12:21.251597 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.341923 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.341952 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.341960 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.341972 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.341980 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.444903 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.444940 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.444950 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.444968 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.444981 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.548170 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.548225 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.548245 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.548270 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.548314 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.652246 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.652356 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.652372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.652397 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.652414 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.755384 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.755433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.755445 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.755465 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.755478 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.858668 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.858728 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.858751 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.858775 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.858792 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.962381 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.962433 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.962449 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.962472 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:21 crc kubenswrapper[4725]: I0227 06:12:21.962490 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:21Z","lastTransitionTime":"2026-02-27T06:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.066198 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.066258 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.066275 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.066336 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.066356 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:22Z","lastTransitionTime":"2026-02-27T06:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.169836 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.169940 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.170012 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.170039 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.170057 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:22Z","lastTransitionTime":"2026-02-27T06:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:22 crc kubenswrapper[4725]: E0227 06:12:22.270282 4725 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.276544 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3
b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.296430 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.316160 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.334912 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.350694 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.376876 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.395666 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: E0227 06:12:22.404851 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.421744 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.456331 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.473959 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.505590 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.525096 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.540821 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc 
kubenswrapper[4725]: I0227 06:12:22.571783 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.589792 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.606419 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:22 crc kubenswrapper[4725]: I0227 06:12:22.622721 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:23 crc kubenswrapper[4725]: I0227 06:12:23.251429 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:23 crc kubenswrapper[4725]: I0227 06:12:23.251495 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:23 crc kubenswrapper[4725]: E0227 06:12:23.251920 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:23 crc kubenswrapper[4725]: I0227 06:12:23.251559 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:23 crc kubenswrapper[4725]: I0227 06:12:23.251519 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:23 crc kubenswrapper[4725]: E0227 06:12:23.252161 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:23 crc kubenswrapper[4725]: E0227 06:12:23.252436 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:23 crc kubenswrapper[4725]: E0227 06:12:23.252636 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:23 crc kubenswrapper[4725]: I0227 06:12:23.498225 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:23 crc kubenswrapper[4725]: E0227 06:12:23.498475 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:23 crc kubenswrapper[4725]: E0227 06:12:23.498624 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:12:31.498587094 +0000 UTC m=+129.961207753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:25 crc kubenswrapper[4725]: I0227 06:12:25.251240 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:25 crc kubenswrapper[4725]: I0227 06:12:25.251281 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:25 crc kubenswrapper[4725]: I0227 06:12:25.251240 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:25 crc kubenswrapper[4725]: I0227 06:12:25.251412 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:25 crc kubenswrapper[4725]: E0227 06:12:25.251460 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:25 crc kubenswrapper[4725]: E0227 06:12:25.251590 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:25 crc kubenswrapper[4725]: E0227 06:12:25.251786 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:25 crc kubenswrapper[4725]: E0227 06:12:25.251880 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.251513 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.251562 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.251654 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.251666 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:27 crc kubenswrapper[4725]: E0227 06:12:27.251799 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:27 crc kubenswrapper[4725]: E0227 06:12:27.251875 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:27 crc kubenswrapper[4725]: E0227 06:12:27.252029 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:27 crc kubenswrapper[4725]: E0227 06:12:27.252142 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.253055 4725 scope.go:117] "RemoveContainer" containerID="da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7" Feb 27 06:12:27 crc kubenswrapper[4725]: E0227 06:12:27.405912 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.987950 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/1.log" Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.991259 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7"} Feb 27 06:12:27 crc kubenswrapper[4725]: I0227 06:12:27.991939 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.010379 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee182
1fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.024607 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.037744 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.049956 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.062961 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.073267 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.084355 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.096957 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.116627 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.129519 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.141994 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.157719 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.203695 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc 
kubenswrapper[4725]: I0227 06:12:28.229010 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.242981 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.253681 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:28 crc kubenswrapper[4725]: I0227 06:12:28.263925 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:28Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:29 crc kubenswrapper[4725]: I0227 06:12:29.193588 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:29 crc kubenswrapper[4725]: E0227 06:12:29.195597 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:29 crc kubenswrapper[4725]: I0227 06:12:29.195680 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:29 crc kubenswrapper[4725]: I0227 06:12:29.195751 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:29 crc kubenswrapper[4725]: I0227 06:12:29.195795 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:29 crc kubenswrapper[4725]: E0227 06:12:29.195868 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:29 crc kubenswrapper[4725]: E0227 06:12:29.196023 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:29 crc kubenswrapper[4725]: E0227 06:12:29.196082 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.203840 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/2.log" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.205082 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/1.log" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.210013 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7" exitCode=1 Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.210074 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" 
event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7"} Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.210128 4725 scope.go:117] "RemoveContainer" containerID="da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.211652 4725 scope.go:117] "RemoveContainer" containerID="a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7" Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.212123 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.232697 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.250580 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.250762 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.255568 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.277017 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.292986 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc 
kubenswrapper[4725]: I0227 06:12:30.325882 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.341398 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.353667 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.367010 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.384797 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8c
cf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host
/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.407364 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apise
rver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.429142 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z 
is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.450121 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.467560 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.481993 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.496605 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.517950 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.550738 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d5
1ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.812040 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.812099 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.812117 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.812141 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.812158 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:30Z","lastTransitionTime":"2026-02-27T06:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.834173 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.840111 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.840169 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.840186 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.840209 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.840225 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:30Z","lastTransitionTime":"2026-02-27T06:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.862153 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.868271 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.868374 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.868402 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.868473 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.868500 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:30Z","lastTransitionTime":"2026-02-27T06:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.891032 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.897721 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.897798 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.897825 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.897863 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.897888 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:30Z","lastTransitionTime":"2026-02-27T06:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.920837 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.926083 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.926144 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.926171 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.926201 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:30 crc kubenswrapper[4725]: I0227 06:12:30.926226 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:30Z","lastTransitionTime":"2026-02-27T06:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.946771 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:30Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:30 crc kubenswrapper[4725]: E0227 06:12:30.947020 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:12:31 crc kubenswrapper[4725]: I0227 06:12:31.218540 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/2.log" Feb 27 06:12:31 crc kubenswrapper[4725]: I0227 06:12:31.251041 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:31 crc kubenswrapper[4725]: I0227 06:12:31.251125 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:31 crc kubenswrapper[4725]: I0227 06:12:31.251165 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:31 crc kubenswrapper[4725]: E0227 06:12:31.251268 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:31 crc kubenswrapper[4725]: E0227 06:12:31.251471 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:31 crc kubenswrapper[4725]: E0227 06:12:31.251707 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:31 crc kubenswrapper[4725]: I0227 06:12:31.594683 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:31 crc kubenswrapper[4725]: E0227 06:12:31.594918 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:31 crc kubenswrapper[4725]: E0227 06:12:31.595041 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:12:47.595012681 +0000 UTC m=+146.057633290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.250990 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:32 crc kubenswrapper[4725]: E0227 06:12:32.251668 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.269222 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.286490 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.316880 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d5
1ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.336872 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.351580 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.371218 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.388225 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc 
kubenswrapper[4725]: E0227 06:12:32.406958 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.423396 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff048
7d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.442580 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.461444 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.478340 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.502705 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8c
cf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host
/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.525594 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apise
rver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.545692 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z 
is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.562753 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.576557 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:32 crc kubenswrapper[4725]: I0227 06:12:32.590451 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:32Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:33 crc kubenswrapper[4725]: I0227 06:12:33.251018 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:33 crc kubenswrapper[4725]: I0227 06:12:33.251087 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:33 crc kubenswrapper[4725]: I0227 06:12:33.251157 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:33 crc kubenswrapper[4725]: E0227 06:12:33.251361 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:33 crc kubenswrapper[4725]: E0227 06:12:33.251518 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:33 crc kubenswrapper[4725]: E0227 06:12:33.251887 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:34 crc kubenswrapper[4725]: I0227 06:12:34.251238 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:34 crc kubenswrapper[4725]: E0227 06:12:34.251489 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:35 crc kubenswrapper[4725]: I0227 06:12:35.250810 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:35 crc kubenswrapper[4725]: I0227 06:12:35.250855 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:35 crc kubenswrapper[4725]: I0227 06:12:35.250948 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:35 crc kubenswrapper[4725]: E0227 06:12:35.251080 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:35 crc kubenswrapper[4725]: E0227 06:12:35.251231 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:35 crc kubenswrapper[4725]: E0227 06:12:35.251444 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:36 crc kubenswrapper[4725]: I0227 06:12:36.250914 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:36 crc kubenswrapper[4725]: E0227 06:12:36.251092 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:37 crc kubenswrapper[4725]: I0227 06:12:37.251172 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:37 crc kubenswrapper[4725]: I0227 06:12:37.251172 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:37 crc kubenswrapper[4725]: E0227 06:12:37.251369 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:37 crc kubenswrapper[4725]: I0227 06:12:37.251196 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:37 crc kubenswrapper[4725]: E0227 06:12:37.251576 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:37 crc kubenswrapper[4725]: E0227 06:12:37.251875 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:37 crc kubenswrapper[4725]: E0227 06:12:37.408924 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:12:38 crc kubenswrapper[4725]: I0227 06:12:38.250606 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:38 crc kubenswrapper[4725]: E0227 06:12:38.250776 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:39 crc kubenswrapper[4725]: I0227 06:12:39.250739 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:39 crc kubenswrapper[4725]: E0227 06:12:39.250916 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:39 crc kubenswrapper[4725]: I0227 06:12:39.250943 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:39 crc kubenswrapper[4725]: I0227 06:12:39.251018 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:39 crc kubenswrapper[4725]: E0227 06:12:39.251149 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:39 crc kubenswrapper[4725]: E0227 06:12:39.251319 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:40 crc kubenswrapper[4725]: I0227 06:12:40.251420 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:40 crc kubenswrapper[4725]: E0227 06:12:40.252280 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.032466 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.032585 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.032614 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.032644 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.032667 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:41Z","lastTransitionTime":"2026-02-27T06:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.052912 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:41Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.058009 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.058056 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.058074 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.058098 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.058118 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:41Z","lastTransitionTime":"2026-02-27T06:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.078712 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:41Z is after 2025-08-24T17:21:41Z"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.083996 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.084047 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.084064 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.084086 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.084106 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:41Z","lastTransitionTime":"2026-02-27T06:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.105188 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:41Z is after 2025-08-24T17:21:41Z"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.111115 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.111189 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.111211 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.111244 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.111269 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:41Z","lastTransitionTime":"2026-02-27T06:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.127162 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:41Z is after 2025-08-24T17:21:41Z"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.132057 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.132115 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.132136 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.132164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.132181 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:41Z","lastTransitionTime":"2026-02-27T06:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.152531 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:41Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.152755 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.251215 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.251349 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:41 crc kubenswrapper[4725]: I0227 06:12:41.251253 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.251482 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.251592 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:41 crc kubenswrapper[4725]: E0227 06:12:41.251765 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.251721 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:42 crc kubenswrapper[4725]: E0227 06:12:42.252376 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.252927 4725 scope.go:117] "RemoveContainer" containerID="a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7" Feb 27 06:12:42 crc kubenswrapper[4725]: E0227 06:12:42.253378 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.273624 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.295031 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.317584 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.349475 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc 
kubenswrapper[4725]: I0227 06:12:42.385947 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.411007 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: E0227 06:12:42.418843 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.432490 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.452260 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7a
d9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.473830 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.493988 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.510968 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.529659 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.544912 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.566896 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.582837 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.608122 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.640831 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da349960364e84335cbdf87e805172325db1da3834f75d7aea6595576bc287d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:13Z\\\",\\\"message\\\":\\\"nd:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0227 06:12:13.877364 6693 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0227 06:12:13.877333 6693 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d5
1ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.661196 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.683124 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.708513 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.735713 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.811780 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.834518 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.846564 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.865908 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ec
d42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.880501 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.897431 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad
52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.913659 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.930863 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.944989 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.964069 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:42 crc kubenswrapper[4725]: I0227 06:12:42.980105 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.002128 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:42Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.026445 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:43Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.250690 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.250923 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:43 crc kubenswrapper[4725]: E0227 06:12:43.251225 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.251649 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:43 crc kubenswrapper[4725]: E0227 06:12:43.251908 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:43 crc kubenswrapper[4725]: E0227 06:12:43.252463 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.267428 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 27 06:12:43 crc kubenswrapper[4725]: I0227 06:12:43.267650 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 27 06:12:44 crc kubenswrapper[4725]: I0227 06:12:44.250971 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:44 crc kubenswrapper[4725]: E0227 06:12:44.251175 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:45 crc kubenswrapper[4725]: I0227 06:12:45.251559 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:45 crc kubenswrapper[4725]: I0227 06:12:45.251609 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:45 crc kubenswrapper[4725]: I0227 06:12:45.251741 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:45 crc kubenswrapper[4725]: E0227 06:12:45.251770 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:45 crc kubenswrapper[4725]: E0227 06:12:45.251981 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:45 crc kubenswrapper[4725]: E0227 06:12:45.252262 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:46 crc kubenswrapper[4725]: I0227 06:12:46.251508 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:46 crc kubenswrapper[4725]: E0227 06:12:46.251962 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:47 crc kubenswrapper[4725]: I0227 06:12:47.251257 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:47 crc kubenswrapper[4725]: I0227 06:12:47.251337 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:47 crc kubenswrapper[4725]: I0227 06:12:47.251368 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:47 crc kubenswrapper[4725]: E0227 06:12:47.251512 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:47 crc kubenswrapper[4725]: E0227 06:12:47.251595 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:47 crc kubenswrapper[4725]: E0227 06:12:47.251676 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:47 crc kubenswrapper[4725]: E0227 06:12:47.420002 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 06:12:47 crc kubenswrapper[4725]: I0227 06:12:47.679324 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:47 crc kubenswrapper[4725]: E0227 06:12:47.679541 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:47 crc kubenswrapper[4725]: E0227 06:12:47.679621 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. No retries permitted until 2026-02-27 06:13:19.679599774 +0000 UTC m=+178.142220383 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:12:48 crc kubenswrapper[4725]: I0227 06:12:48.250865 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:48 crc kubenswrapper[4725]: E0227 06:12:48.251045 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.250503 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.250587 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.250642 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:49 crc kubenswrapper[4725]: E0227 06:12:49.250801 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:49 crc kubenswrapper[4725]: E0227 06:12:49.250943 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:49 crc kubenswrapper[4725]: E0227 06:12:49.251076 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.302424 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/0.log" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.302526 4725 generic.go:334] "Generic (PLEG): container finished" podID="7439e599-9b13-45e6-8f71-ef3760b2235b" containerID="41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7" exitCode=1 Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.302625 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerDied","Data":"41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7"} Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.303400 4725 scope.go:117] "RemoveContainer" containerID="41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.324044 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.342257 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.380132 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.397061 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.417934 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.437407 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.453419 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.469692 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a
0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.493168 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.514880 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.537663 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.562317 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.575994 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.594433 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.614250 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.633618 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.649169 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.670001 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:49 crc kubenswrapper[4725]: I0227 06:12:49.691095 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:49Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.251616 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:50 crc kubenswrapper[4725]: E0227 06:12:50.251815 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.309425 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/0.log" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.309517 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerStarted","Data":"275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd"} Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.329475 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.345907 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.372809 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad
52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.396439 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.420619 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.444086 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.474989 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.492976 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.510840 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.531340 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.551887 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.567467 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc 
kubenswrapper[4725]: I0227 06:12:50.587694 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.607868 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.626869 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.644030 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ec
d42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.676083 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.695257 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:50 crc kubenswrapper[4725]: I0227 06:12:50.712210 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:50Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.251081 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.251179 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.251094 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.251265 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.251412 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.251787 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.335633 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.335698 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.335717 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.335743 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.335761 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:51Z","lastTransitionTime":"2026-02-27T06:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.360171 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:51Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.365734 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.365793 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.365811 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.365836 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.365854 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:51Z","lastTransitionTime":"2026-02-27T06:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.387497 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:51Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.392116 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.392413 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.392648 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.392812 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.392952 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:51Z","lastTransitionTime":"2026-02-27T06:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.416253 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:51Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.421034 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.421087 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.421104 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.421127 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.421144 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:51Z","lastTransitionTime":"2026-02-27T06:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.443235 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:51Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.448177 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.448410 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.448571 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.448728 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:12:51 crc kubenswrapper[4725]: I0227 06:12:51.448870 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:12:51Z","lastTransitionTime":"2026-02-27T06:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.469805 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:51Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:51 crc kubenswrapper[4725]: E0227 06:12:51.470031 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.250795 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:52 crc kubenswrapper[4725]: E0227 06:12:52.250971 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.268416 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.294392 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f
793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.316338 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\"
,\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.336729 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.356878 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.376057 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.392001 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.410562 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: E0227 06:12:52.420885 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.435094 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.464550 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.479479 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.497385 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.515892 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.534669 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.556101 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.589790 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.609262 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.632065 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:52 crc kubenswrapper[4725]: I0227 06:12:52.649819 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:52Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.136721 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.136826 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.136878 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.136849612 +0000 UTC m=+215.599470171 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.136925 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.136935 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.136950 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.136961 4725 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.137007 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:13:57.136993566 +0000 UTC m=+215.599614135 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.137024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.137096 4725 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.137121 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.137115119 +0000 UTC m=+215.599735688 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.137134 4725 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.137248 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.137220832 +0000 UTC m=+215.599841431 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.238410 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.238709 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 06:12:53 crc 
kubenswrapper[4725]: E0227 06:12:53.238772 4725 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.238800 4725 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.238920 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.238882684 +0000 UTC m=+215.701503313 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.251431 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.251576 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.251648 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:53 crc kubenswrapper[4725]: I0227 06:12:53.251667 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.251816 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:53 crc kubenswrapper[4725]: E0227 06:12:53.252087 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:54 crc kubenswrapper[4725]: I0227 06:12:54.251134 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:54 crc kubenswrapper[4725]: E0227 06:12:54.251379 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:55 crc kubenswrapper[4725]: I0227 06:12:55.250763 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:55 crc kubenswrapper[4725]: I0227 06:12:55.250869 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:55 crc kubenswrapper[4725]: I0227 06:12:55.250790 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:55 crc kubenswrapper[4725]: E0227 06:12:55.251012 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:55 crc kubenswrapper[4725]: E0227 06:12:55.251130 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:55 crc kubenswrapper[4725]: E0227 06:12:55.251340 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:56 crc kubenswrapper[4725]: I0227 06:12:56.250514 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:56 crc kubenswrapper[4725]: E0227 06:12:56.250764 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:56 crc kubenswrapper[4725]: I0227 06:12:56.252052 4725 scope.go:117] "RemoveContainer" containerID="a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.250484 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.250580 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:57 crc kubenswrapper[4725]: E0227 06:12:57.250650 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.250580 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:57 crc kubenswrapper[4725]: E0227 06:12:57.250762 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:57 crc kubenswrapper[4725]: E0227 06:12:57.250859 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.337713 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/3.log" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.338902 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/2.log" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.342607 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" exitCode=1 Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.342685 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.342734 4725 scope.go:117] "RemoveContainer" containerID="a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.343871 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:12:57 crc kubenswrapper[4725]: E0227 06:12:57.344114 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:12:57 crc 
kubenswrapper[4725]: I0227 06:12:57.367412 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0
d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.385740 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.401707 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.418996 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: E0227 06:12:57.421814 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.432824 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.455223 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aae
db6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.470380 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.487485 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.508611 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.539071 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:57Z\\\",\\\"message\\\":\\\"shift-machine-api/machine-api-operator-machine-webhook-z5vz6 as it is not a known egress service\\\\nI0227 06:12:57.304566 7263 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304528 7263 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304852 7263 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304444 7263 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:57.305051 7263 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:57.305087 7263 factory.go:656] Stopping watch factory\\\\nI0227 06:12:57.305115 7263 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:57.304568 7263 egressservice_zone_endpointslice.go:80] Ignoring updating openshift-machine-config-operator/machine-config-operator for endpointslice openshift-machine-config-operator/machine-config-operator-g8487 as it is not a known egress service\\\\nI0227 06:12:57.305321 7263 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:57.305654 7263 ovnkube.go:599] Stopped ovnkube\\\\nI0227 06:12:57.305713 7263 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0227 06:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.559448 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.578204 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.594274 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.612841 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.629404 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.662174 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56
f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.682005 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.697974 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:57 crc kubenswrapper[4725]: I0227 06:12:57.715724 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:12:57Z is after 2025-08-24T17:21:41Z" Feb 27 06:12:58 crc kubenswrapper[4725]: I0227 06:12:58.250808 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:12:58 crc kubenswrapper[4725]: E0227 06:12:58.251004 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:12:58 crc kubenswrapper[4725]: I0227 06:12:58.354324 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/3.log" Feb 27 06:12:59 crc kubenswrapper[4725]: I0227 06:12:59.251453 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:12:59 crc kubenswrapper[4725]: E0227 06:12:59.251649 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:12:59 crc kubenswrapper[4725]: I0227 06:12:59.251673 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:12:59 crc kubenswrapper[4725]: I0227 06:12:59.251687 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:12:59 crc kubenswrapper[4725]: E0227 06:12:59.251962 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:12:59 crc kubenswrapper[4725]: E0227 06:12:59.252224 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:00 crc kubenswrapper[4725]: I0227 06:13:00.251586 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:00 crc kubenswrapper[4725]: E0227 06:13:00.251831 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.251226 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.251264 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.251334 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.251507 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.251576 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.251728 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.712533 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.712615 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.712637 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.712666 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.712687 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:01Z","lastTransitionTime":"2026-02-27T06:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.735333 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:01Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.742164 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.742220 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.742237 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.742265 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.742312 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:01Z","lastTransitionTime":"2026-02-27T06:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.763243 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:01Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.768553 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.768620 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.768639 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.768672 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.768692 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:01Z","lastTransitionTime":"2026-02-27T06:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.789538 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:01Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.794228 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.794282 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.794334 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.794362 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.794454 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:01Z","lastTransitionTime":"2026-02-27T06:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.817059 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:01Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.822317 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.822372 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.822393 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.822419 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:01 crc kubenswrapper[4725]: I0227 06:13:01.822436 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:01Z","lastTransitionTime":"2026-02-27T06:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.842942 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:01Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:01 crc kubenswrapper[4725]: E0227 06:13:01.843138 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.250917 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:02 crc kubenswrapper[4725]: E0227 06:13:02.251190 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.277848 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-2
7T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.295481 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.312666 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.329085 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.351931 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.368698 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.389892 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.418017 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: E0227 06:13:02.429563 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.434472 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.461280 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aae
db6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.477098 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.496424 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.517001 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.547019 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0c4077d371f6a18de7baba9daf56b909a3eb715e1d2585bea6f0091bea380a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:29Z\\\",\\\"message\\\":\\\"9 in node crc\\\\nI0227 06:12:28.206272 6942 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-lchm9 after 0 failed attempt(s)\\\\nI0227 06:12:28.206282 6942 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-lchm9\\\\nI0227 06:12:28.206223 6942 
obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-g8jqm\\\\nI0227 06:12:28.206346 6942 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206359 6942 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0227 06:12:28.206362 6942 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0227 06:12:28.206370 6942 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0227 06:12:28.206399 6942 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0227 06:12:28.206437 6942 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:57Z\\\",\\\"message\\\":\\\"shift-machine-api/machine-api-operator-machine-webhook-z5vz6 as it is not a known egress service\\\\nI0227 06:12:57.304566 7263 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304528 7263 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304852 7263 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304444 7263 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:57.305051 7263 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:57.305087 7263 factory.go:656] Stopping watch factory\\\\nI0227 06:12:57.305115 7263 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:57.304568 7263 egressservice_zone_endpointslice.go:80] Ignoring updating openshift-machine-config-operator/machine-config-operator for endpointslice openshift-machine-config-operator/machine-config-operator-g8487 as it is not a known egress service\\\\nI0227 06:12:57.305321 7263 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:57.305654 7263 ovnkube.go:599] Stopped ovnkube\\\\nI0227 06:12:57.305713 7263 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0227 06:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.568668 4725 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.589950 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.611627 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.631597 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.643881 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:02Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.979944 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:13:02 crc kubenswrapper[4725]: I0227 06:13:02.981864 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:13:02 crc kubenswrapper[4725]: E0227 06:13:02.982269 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.007655 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.028432 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.043911 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.061399 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.077271 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.103339 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56
f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.122020 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.139321 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.159558 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.177363 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.201536 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" 
Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.220547 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.240436 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.251433 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.251442 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.251593 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:03 crc kubenswrapper[4725]: E0227 06:13:03.251824 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:03 crc kubenswrapper[4725]: E0227 06:13:03.252147 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:03 crc kubenswrapper[4725]: E0227 06:13:03.252259 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.260966 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.290755 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aae
db6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.303935 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.319589 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dadde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.336998 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:03 crc kubenswrapper[4725]: I0227 06:13:03.368931 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:57Z\\\",\\\"message\\\":\\\"shift-machine-api/machine-api-operator-machine-webhook-z5vz6 as it is not a known egress service\\\\nI0227 06:12:57.304566 7263 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304528 7263 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304852 7263 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304444 7263 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:57.305051 7263 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:57.305087 7263 factory.go:656] Stopping watch factory\\\\nI0227 06:12:57.305115 7263 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:57.304568 7263 egressservice_zone_endpointslice.go:80] Ignoring updating openshift-machine-config-operator/machine-config-operator for endpointslice openshift-machine-config-operator/machine-config-operator-g8487 as it is not a known egress service\\\\nI0227 06:12:57.305321 7263 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:57.305654 7263 ovnkube.go:599] Stopped ovnkube\\\\nI0227 06:12:57.305713 7263 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0227 06:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:03Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:04 crc kubenswrapper[4725]: I0227 06:13:04.251748 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:04 crc kubenswrapper[4725]: E0227 06:13:04.252018 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:05 crc kubenswrapper[4725]: I0227 06:13:05.250724 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:05 crc kubenswrapper[4725]: I0227 06:13:05.250727 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:05 crc kubenswrapper[4725]: E0227 06:13:05.250895 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:05 crc kubenswrapper[4725]: I0227 06:13:05.250949 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:05 crc kubenswrapper[4725]: E0227 06:13:05.251064 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:05 crc kubenswrapper[4725]: E0227 06:13:05.251330 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:06 crc kubenswrapper[4725]: I0227 06:13:06.250829 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:06 crc kubenswrapper[4725]: E0227 06:13:06.251565 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:07 crc kubenswrapper[4725]: I0227 06:13:07.250855 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:07 crc kubenswrapper[4725]: I0227 06:13:07.250895 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:07 crc kubenswrapper[4725]: I0227 06:13:07.250925 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:07 crc kubenswrapper[4725]: E0227 06:13:07.251068 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:07 crc kubenswrapper[4725]: E0227 06:13:07.251245 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:07 crc kubenswrapper[4725]: E0227 06:13:07.251523 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:07 crc kubenswrapper[4725]: E0227 06:13:07.431468 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:13:08 crc kubenswrapper[4725]: I0227 06:13:08.251655 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:08 crc kubenswrapper[4725]: E0227 06:13:08.251899 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:09 crc kubenswrapper[4725]: I0227 06:13:09.250858 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:09 crc kubenswrapper[4725]: I0227 06:13:09.250922 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:09 crc kubenswrapper[4725]: I0227 06:13:09.250870 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:09 crc kubenswrapper[4725]: E0227 06:13:09.251031 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:09 crc kubenswrapper[4725]: E0227 06:13:09.251204 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:09 crc kubenswrapper[4725]: E0227 06:13:09.251377 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:10 crc kubenswrapper[4725]: I0227 06:13:10.250696 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:10 crc kubenswrapper[4725]: E0227 06:13:10.250861 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:11 crc kubenswrapper[4725]: I0227 06:13:11.251177 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:11 crc kubenswrapper[4725]: E0227 06:13:11.251395 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:11 crc kubenswrapper[4725]: I0227 06:13:11.251458 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:11 crc kubenswrapper[4725]: I0227 06:13:11.251465 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:11 crc kubenswrapper[4725]: E0227 06:13:11.251961 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:11 crc kubenswrapper[4725]: E0227 06:13:11.252104 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.214532 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.214619 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.214653 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.214684 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.214705 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:12Z","lastTransitionTime":"2026-02-27T06:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.236048 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.240966 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.241037 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.241055 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.241081 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.241098 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:12Z","lastTransitionTime":"2026-02-27T06:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.251614 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.251796 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.263705 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.269350 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.269530 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.269675 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.269869 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.270034 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:12Z","lastTransitionTime":"2026-02-27T06:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.278599 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20acbe1e-5472-4b81-b830-d2cb9c19f564\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"message\\\":\\\"W0227 06:11:13.551180 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0227 06:11:13.551843 1 crypto.go:601] Generating new CA for check-endpoints-signer@1772172673 cert, and key in /tmp/serving-cert-563015182/serving-signer.crt, /tmp/serving-cert-563015182/serving-signer.key\\\\nI0227 06:11:14.031052 1 observer_polling.go:159] Starting file observer\\\\nW0227 06:11:14.043874 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0227 06:11:14.044096 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 06:11:14.045823 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-563015182/tls.crt::/tmp/serving-cert-563015182/tls.key\\\\\\\"\\\\nF0227 06:11:14.672987 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:11:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.293166 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.298878 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.298945 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.298964 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.298990 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.299007 4725 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:12Z","lastTransitionTime":"2026-02-27T06:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.303153 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22e3e44e06c1a78906668e15e3467cb9b2173ed93396f7077d3b27334a961256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01e59c66aa831859c1d05d0f860267e300025518daa267a1f93c6ec19f7a4c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.315394 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.321200 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.321260 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.321279 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.321329 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.321349 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:12Z","lastTransitionTime":"2026-02-27T06:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.326457 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.342900 4725 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T06:13:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d597123e-4f5a-4643-94b9-026053817d04\\\",\\\"systemUUID\\\":\\\"8cd3fff4-1c99-4289-9cf4-2c947cb81dcd\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.343703 4725 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.346117 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.364031 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cb
b79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.390083 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gxjff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c170c9d-f4fe-48cb-b9f5-8c5e7d54ea8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eddd6a97b751a5e678fd0b9f793890ca1d8355a0db4a9cecee1821fccf2d78a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe9c61034ff5dce4b34d9bd490a7e7f5f4c2f98074440afd6d9c5db883991561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf42d54b1b954544f9b16d1c52f95325e53d1b3123deda6c13a5f0b106cf9541\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be1605d7b87ab9b9d7da6f475f3b13de3e91a5094ec1422b704f79e407aa23d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8aae
db6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8aaedb6e81591cde5de2d043d116c3e08d458006f17838405953142b6666745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e537d492a3f8542f28470c0de5c6e1ad52be59fba37c08d9104f585d50c736f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981c51509a8dba6c453b0b492db609a7966cd1d7c93dbd39490cc6f29ca88e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rswtk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gxjff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.409492 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f0f5d69-629e-4fa1-b8d9-a784228154f6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eb822ea857ba33de084c86521bb9e285b3121f7afbae89109d9a5a3d2e9f20a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bec3db3543b18d62b68c06981a9d33aac1a5035ae4e9ba7ae44a952aa253b328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: E0227 06:13:12.432277 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.434004 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc8155d5-fe14-428d-b777-2dbf5d496412\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4595e58ecad1a5b050dcf2e878910911a56a9f89d63465a5f325c98aa251f18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da
dde0c349b10fff90f10b92c9b5281efad9255532e577e401bbaba616e4a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://751966e18a90fcdc6dd2d4d56723a2e2844258df22a6ed28820ff07a1daa747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6752d0f4ad04fc92c277a8cb2e701d70843406c79b1d289c857372d280689bd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.457403 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c174529f26892e72ec8315a6802c98550c85d34d562446405aa2a190290fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.491566 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05a446dc-e501-4173-a911-7b33ca4608c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:57Z\\\",\\\"message\\\":\\\"shift-machine-api/machine-api-operator-machine-webhook-z5vz6 as it is not a known egress service\\\\nI0227 06:12:57.304566 7263 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304528 7263 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304852 7263 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 06:12:57.304444 7263 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 06:12:57.305051 7263 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 06:12:57.305087 7263 factory.go:656] Stopping watch factory\\\\nI0227 06:12:57.305115 7263 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 06:12:57.304568 7263 egressservice_zone_endpointslice.go:80] Ignoring updating openshift-machine-config-operator/machine-config-operator for endpointslice openshift-machine-config-operator/machine-config-operator-g8487 as it is not a known egress service\\\\nI0227 06:12:57.305321 7263 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 06:12:57.305654 7263 ovnkube.go:599] Stopped ovnkube\\\\nI0227 06:12:57.305713 7263 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0227 06:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e19ab55ab098c3f707
f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbt4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lchm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.515258 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab1fcb5-e7af-4801-afbb-2d004cc23a67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79cd0f96a2ff39455b6d235de96030ddaca759e00584bc23e6888b40ac70983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56a7813706a3657e0df39c993cac37ce207456cf0a29d37220d8a1e5e085aec5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T06:10:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 06:10:24.596445 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 06:10:24.599341 1 observer_polling.go:159] Starting file observer\\\\nI0227 06:10:24.663616 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 06:10:24.667434 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 06:10:53.749902 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 06:10:53.750156 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:10:52Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d03bcbda04f806ba411f8445c46ab18f5fc0ff52ced98c092f88dbb8e960adbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21110adb8169c7b16d732637ca093685ce98a5e9f8ce19f128f14cddd5239cfb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.536767 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.558878 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.582194 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-g8jqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7439e599-9b13-45e6-8f71-ef3760b2235b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T06:12:49Z\\\",\\\"message\\\":\\\"2026-02-27T06:12:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb\\\\n2026-02-27T06:12:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1463b4e7-4e55-4c15-b031-5af8f9a0dfeb to /host/opt/cni/bin/\\\\n2026-02-27T06:12:04Z [verbose] multus-daemon started\\\\n2026-02-27T06:12:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T06:12:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhqtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-g8jqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.601144 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42fz8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vcl2g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.643608 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56
f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.662646 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b
2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.680093 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:12 crc kubenswrapper[4725]: I0227 06:13:12.698121 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:12Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:13 crc kubenswrapper[4725]: I0227 06:13:13.250844 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:13 crc kubenswrapper[4725]: I0227 06:13:13.250972 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:13 crc kubenswrapper[4725]: E0227 06:13:13.251121 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:13 crc kubenswrapper[4725]: I0227 06:13:13.250867 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:13 crc kubenswrapper[4725]: E0227 06:13:13.251273 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:13 crc kubenswrapper[4725]: E0227 06:13:13.251437 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:14 crc kubenswrapper[4725]: I0227 06:13:14.251680 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:14 crc kubenswrapper[4725]: E0227 06:13:14.252368 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:14 crc kubenswrapper[4725]: I0227 06:13:14.252789 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:13:14 crc kubenswrapper[4725]: E0227 06:13:14.253257 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:13:15 crc kubenswrapper[4725]: I0227 06:13:15.251524 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:15 crc kubenswrapper[4725]: I0227 06:13:15.251565 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:15 crc kubenswrapper[4725]: I0227 06:13:15.251537 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:15 crc kubenswrapper[4725]: E0227 06:13:15.251720 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:15 crc kubenswrapper[4725]: E0227 06:13:15.251831 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:15 crc kubenswrapper[4725]: E0227 06:13:15.251945 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:16 crc kubenswrapper[4725]: I0227 06:13:16.251028 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:16 crc kubenswrapper[4725]: E0227 06:13:16.251206 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:17 crc kubenswrapper[4725]: I0227 06:13:17.254497 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:17 crc kubenswrapper[4725]: E0227 06:13:17.256199 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:17 crc kubenswrapper[4725]: I0227 06:13:17.255015 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:17 crc kubenswrapper[4725]: E0227 06:13:17.256744 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:17 crc kubenswrapper[4725]: I0227 06:13:17.256917 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:17 crc kubenswrapper[4725]: E0227 06:13:17.257025 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:17 crc kubenswrapper[4725]: E0227 06:13:17.433560 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:13:18 crc kubenswrapper[4725]: I0227 06:13:18.250714 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:18 crc kubenswrapper[4725]: E0227 06:13:18.251185 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:19 crc kubenswrapper[4725]: I0227 06:13:19.251222 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:19 crc kubenswrapper[4725]: I0227 06:13:19.251308 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:19 crc kubenswrapper[4725]: I0227 06:13:19.251234 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:19 crc kubenswrapper[4725]: E0227 06:13:19.251485 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:19 crc kubenswrapper[4725]: E0227 06:13:19.251626 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:19 crc kubenswrapper[4725]: E0227 06:13:19.251926 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:19 crc kubenswrapper[4725]: I0227 06:13:19.749423 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:19 crc kubenswrapper[4725]: E0227 06:13:19.749744 4725 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:13:19 crc kubenswrapper[4725]: E0227 06:13:19.749872 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs podName:ea1b7fec-c4c1-4ae5-a74a-8396d6428900 nodeName:}" failed. No retries permitted until 2026-02-27 06:14:23.749840116 +0000 UTC m=+242.212460785 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs") pod "network-metrics-daemon-vcl2g" (UID: "ea1b7fec-c4c1-4ae5-a74a-8396d6428900") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 06:13:20 crc kubenswrapper[4725]: I0227 06:13:20.250639 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:20 crc kubenswrapper[4725]: E0227 06:13:20.251345 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:21 crc kubenswrapper[4725]: I0227 06:13:21.250505 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:21 crc kubenswrapper[4725]: I0227 06:13:21.250532 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:21 crc kubenswrapper[4725]: E0227 06:13:21.250974 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:21 crc kubenswrapper[4725]: I0227 06:13:21.250639 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:21 crc kubenswrapper[4725]: E0227 06:13:21.251097 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:21 crc kubenswrapper[4725]: E0227 06:13:21.251650 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.254926 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:22 crc kubenswrapper[4725]: E0227 06:13:22.255682 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.274599 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b5a9f66-b388-4334-82c5-d3c8de8d86be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf3441d6447d94552764ecd4cb7154c351c3e39e31d7ad9320d72f3363d9a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa7002c98502a128a5603d04aad1b6ee162ecd42adf410cd3e5feab8d110fd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knvjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-phgmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.307411 4725 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa18796a-e47d-40ff-bd1e-f0ebc2ae24e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e8eb19c146f735663bd9b890951674d635626bb8571fe421d34da55f575eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068fa71c1a4d412ae5f427ab36327d343c686034d21efe9749602b3d3091f936
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://574690ada20f52bfea5efbf0dcc1aa44b2b2bf9c8645efa46d49165f091cad71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0487d3bcb46924158a513a9d8c49c56f38d00c5d6c66c0b42656a9c3d92808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://130d887368cfdc4738feacc13f62cb538eaad79093c5ed0391e7bc3c51cb7359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3cfa5d967b849020c5c92baf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bef28b1a66dd8d500c4c3fac477e75522e49b3
cfa5d967b849020c5c92baf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ce98f43405353672ef02e12feef5b979bd410296f7cf03f9770f41ef1ae37fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5488e5f62173fe26a52a2947dd2f06a2f12b9fd23dec6f85c6ac384ebd56ddb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T06:10:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T06:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:10:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.324397 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d55b8c33c5bee2053bcf7e3ec26cf4e9a25de650da73bb2fb5c351575557d07\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwrg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mg969\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.340137 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zpdks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dcaad94-78ba-48c7-aac6-9d8352419ed5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312376a69fd032dcd59a93a16bae67d952c5866c6f58c635294e9f3d5b29f0c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm4n5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zpdks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.358776 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:11:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eea4a11538f3593309088ba1a445448cb193fc761d4eaa664aab09428d7fe04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T06:13:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.375238 4725 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gn9fk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2572a7c-3003-4e40-a052-1718a5ef100d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T06:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86979d5fe3a1cbb79b2dcf3ca7903658877afae47345189209757e07a9773ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T06:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-whqnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T06:12:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gn9fk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T06:13:22Z is after 2025-08-24T17:21:41Z" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.417640 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gxjff" podStartSLOduration=118.417617339 podStartE2EDuration="1m58.417617339s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.416827906 +0000 UTC m=+180.879448515" watchObservedRunningTime="2026-02-27 06:13:22.417617339 +0000 UTC m=+180.880237948" Feb 27 06:13:22 crc kubenswrapper[4725]: E0227 06:13:22.450622 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.480023 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.479999668 podStartE2EDuration="1m26.479999668s" podCreationTimestamp="2026-02-27 06:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.479732381 +0000 UTC m=+180.942352990" watchObservedRunningTime="2026-02-27 06:13:22.479999668 +0000 UTC m=+180.942620247" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.566159 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=90.566142755 podStartE2EDuration="1m30.566142755s" podCreationTimestamp="2026-02-27 06:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.565458276 +0000 UTC m=+181.028078865" watchObservedRunningTime="2026-02-27 06:13:22.566142755 +0000 UTC m=+181.028763334" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.579424 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.579401587 podStartE2EDuration="39.579401587s" podCreationTimestamp="2026-02-27 06:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.578497861 +0000 UTC m=+181.041118450" watchObservedRunningTime="2026-02-27 06:13:22.579401587 +0000 UTC m=+181.042022176" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.609735 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.609766 4725 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.609774 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.609787 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.609796 4725 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T06:13:22Z","lastTransitionTime":"2026-02-27T06:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.624757 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g8jqm" podStartSLOduration=118.624735997 podStartE2EDuration="1m58.624735997s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.612086783 +0000 UTC m=+181.074707402" watchObservedRunningTime="2026-02-27 06:13:22.624735997 +0000 UTC m=+181.087356576" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.654642 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl"] Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.655104 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=39.655080708 podStartE2EDuration="39.655080708s" 
podCreationTimestamp="2026-02-27 06:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.654945285 +0000 UTC m=+181.117565874" watchObservedRunningTime="2026-02-27 06:13:22.655080708 +0000 UTC m=+181.117701297" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.655473 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.657211 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.657369 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.658577 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.658603 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.737982 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gn9fk" podStartSLOduration=118.737959203 podStartE2EDuration="1m58.737959203s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.727812569 +0000 UTC m=+181.190433178" watchObservedRunningTime="2026-02-27 06:13:22.737959203 +0000 UTC m=+181.200579772" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.766363 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd/etcd-crc" podStartSLOduration=83.766341569 podStartE2EDuration="1m23.766341569s" podCreationTimestamp="2026-02-27 06:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.765159646 +0000 UTC m=+181.227780275" watchObservedRunningTime="2026-02-27 06:13:22.766341569 +0000 UTC m=+181.228962158" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.775868 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podStartSLOduration=118.775846906 podStartE2EDuration="1m58.775846906s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.775113835 +0000 UTC m=+181.237734464" watchObservedRunningTime="2026-02-27 06:13:22.775846906 +0000 UTC m=+181.238467475" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.782683 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4027b0ce-7803-4b6e-9659-c8ede71e38f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.782866 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4027b0ce-7803-4b6e-9659-c8ede71e38f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.782959 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4027b0ce-7803-4b6e-9659-c8ede71e38f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.783029 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4027b0ce-7803-4b6e-9659-c8ede71e38f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.783218 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4027b0ce-7803-4b6e-9659-c8ede71e38f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.789007 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zpdks" podStartSLOduration=118.788983704 podStartE2EDuration="1m58.788983704s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.786969268 +0000 UTC m=+181.249589867" watchObservedRunningTime="2026-02-27 06:13:22.788983704 +0000 UTC m=+181.251604313" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.803148 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-phgmk" podStartSLOduration=117.803130111 podStartE2EDuration="1m57.803130111s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:22.802427012 +0000 UTC m=+181.265047611" watchObservedRunningTime="2026-02-27 06:13:22.803130111 +0000 UTC m=+181.265750690" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.884346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4027b0ce-7803-4b6e-9659-c8ede71e38f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.884882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4027b0ce-7803-4b6e-9659-c8ede71e38f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.884498 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4027b0ce-7803-4b6e-9659-c8ede71e38f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.884997 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/4027b0ce-7803-4b6e-9659-c8ede71e38f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.885529 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4027b0ce-7803-4b6e-9659-c8ede71e38f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.885833 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4027b0ce-7803-4b6e-9659-c8ede71e38f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.886110 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4027b0ce-7803-4b6e-9659-c8ede71e38f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.886837 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4027b0ce-7803-4b6e-9659-c8ede71e38f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: 
I0227 06:13:22.895662 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4027b0ce-7803-4b6e-9659-c8ede71e38f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.909483 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4027b0ce-7803-4b6e-9659-c8ede71e38f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cdxxl\" (UID: \"4027b0ce-7803-4b6e-9659-c8ede71e38f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: I0227 06:13:22.968950 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" Feb 27 06:13:22 crc kubenswrapper[4725]: W0227 06:13:22.990825 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4027b0ce_7803_4b6e_9659_c8ede71e38f6.slice/crio-483b2f0d84c01900f7761b5eba098bd19d269dc4a51a8643337334c73c670ccc WatchSource:0}: Error finding container 483b2f0d84c01900f7761b5eba098bd19d269dc4a51a8643337334c73c670ccc: Status 404 returned error can't find the container with id 483b2f0d84c01900f7761b5eba098bd19d269dc4a51a8643337334c73c670ccc Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.229099 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.240791 4725 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.251182 4725 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.251265 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.251189 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:23 crc kubenswrapper[4725]: E0227 06:13:23.251490 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:23 crc kubenswrapper[4725]: E0227 06:13:23.251605 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:23 crc kubenswrapper[4725]: E0227 06:13:23.251837 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.462192 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" event={"ID":"4027b0ce-7803-4b6e-9659-c8ede71e38f6","Type":"ContainerStarted","Data":"3bb838b9d641e33023c0a8bc68b8aa29549c7901cf8174022bee22576579c7fc"} Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.462276 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" event={"ID":"4027b0ce-7803-4b6e-9659-c8ede71e38f6","Type":"ContainerStarted","Data":"483b2f0d84c01900f7761b5eba098bd19d269dc4a51a8643337334c73c670ccc"} Feb 27 06:13:23 crc kubenswrapper[4725]: I0227 06:13:23.486811 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cdxxl" podStartSLOduration=119.486786848 podStartE2EDuration="1m59.486786848s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:23.485149742 +0000 UTC m=+181.947770361" watchObservedRunningTime="2026-02-27 06:13:23.486786848 +0000 UTC m=+181.949407447" Feb 27 06:13:24 crc kubenswrapper[4725]: I0227 06:13:24.251411 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:24 crc kubenswrapper[4725]: E0227 06:13:24.251612 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:25 crc kubenswrapper[4725]: I0227 06:13:25.251469 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:25 crc kubenswrapper[4725]: I0227 06:13:25.251490 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:25 crc kubenswrapper[4725]: I0227 06:13:25.251490 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:25 crc kubenswrapper[4725]: E0227 06:13:25.251844 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:25 crc kubenswrapper[4725]: E0227 06:13:25.251919 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:25 crc kubenswrapper[4725]: E0227 06:13:25.251670 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:26 crc kubenswrapper[4725]: I0227 06:13:26.251127 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:26 crc kubenswrapper[4725]: E0227 06:13:26.251402 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:27 crc kubenswrapper[4725]: I0227 06:13:27.251631 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:27 crc kubenswrapper[4725]: I0227 06:13:27.251766 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:27 crc kubenswrapper[4725]: E0227 06:13:27.251836 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:27 crc kubenswrapper[4725]: I0227 06:13:27.251892 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:27 crc kubenswrapper[4725]: E0227 06:13:27.252063 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:27 crc kubenswrapper[4725]: E0227 06:13:27.252214 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:27 crc kubenswrapper[4725]: I0227 06:13:27.253713 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:13:27 crc kubenswrapper[4725]: E0227 06:13:27.254079 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lchm9_openshift-ovn-kubernetes(05a446dc-e501-4173-a911-7b33ca4608c6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" Feb 27 06:13:27 crc kubenswrapper[4725]: E0227 06:13:27.453096 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 27 06:13:28 crc kubenswrapper[4725]: I0227 06:13:28.251080 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:28 crc kubenswrapper[4725]: E0227 06:13:28.251671 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:29 crc kubenswrapper[4725]: I0227 06:13:29.251348 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:29 crc kubenswrapper[4725]: I0227 06:13:29.251450 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:29 crc kubenswrapper[4725]: I0227 06:13:29.251363 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:29 crc kubenswrapper[4725]: E0227 06:13:29.251549 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:29 crc kubenswrapper[4725]: E0227 06:13:29.251650 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:29 crc kubenswrapper[4725]: E0227 06:13:29.251770 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:30 crc kubenswrapper[4725]: I0227 06:13:30.251330 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:30 crc kubenswrapper[4725]: E0227 06:13:30.251539 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:31 crc kubenswrapper[4725]: I0227 06:13:31.250756 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:31 crc kubenswrapper[4725]: I0227 06:13:31.250798 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:31 crc kubenswrapper[4725]: I0227 06:13:31.250807 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:31 crc kubenswrapper[4725]: E0227 06:13:31.251083 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:31 crc kubenswrapper[4725]: E0227 06:13:31.251215 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:31 crc kubenswrapper[4725]: E0227 06:13:31.251575 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:32 crc kubenswrapper[4725]: I0227 06:13:32.250792 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:32 crc kubenswrapper[4725]: E0227 06:13:32.252705 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:32 crc kubenswrapper[4725]: E0227 06:13:32.454044 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:13:33 crc kubenswrapper[4725]: I0227 06:13:33.251225 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:33 crc kubenswrapper[4725]: I0227 06:13:33.251330 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:33 crc kubenswrapper[4725]: I0227 06:13:33.251243 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:33 crc kubenswrapper[4725]: E0227 06:13:33.251490 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:33 crc kubenswrapper[4725]: E0227 06:13:33.251659 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 06:13:33 crc kubenswrapper[4725]: E0227 06:13:33.251844 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 06:13:34 crc kubenswrapper[4725]: I0227 06:13:34.251625 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:34 crc kubenswrapper[4725]: E0227 06:13:34.252063 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.251060 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.251119 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.251183 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:35 crc kubenswrapper[4725]: E0227 06:13:35.251334 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900" Feb 27 06:13:35 crc kubenswrapper[4725]: E0227 06:13:35.251453 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:35 crc kubenswrapper[4725]: E0227 06:13:35.251675 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.515351 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/1.log"
Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.516160 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/0.log"
Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.516256 4725 generic.go:334] "Generic (PLEG): container finished" podID="7439e599-9b13-45e6-8f71-ef3760b2235b" containerID="275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd" exitCode=1
Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.516327 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerDied","Data":"275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd"}
Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.516374 4725 scope.go:117] "RemoveContainer" containerID="41bc5713551ca4398de3630e166882c8278699528693e95856441bc12ad152b7"
Feb 27 06:13:35 crc kubenswrapper[4725]: I0227 06:13:35.517095 4725 scope.go:117] "RemoveContainer" containerID="275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd"
Feb 27 06:13:35 crc kubenswrapper[4725]: E0227
06:13:35.517442 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-g8jqm_openshift-multus(7439e599-9b13-45e6-8f71-ef3760b2235b)\"" pod="openshift-multus/multus-g8jqm" podUID="7439e599-9b13-45e6-8f71-ef3760b2235b"
Feb 27 06:13:36 crc kubenswrapper[4725]: I0227 06:13:36.250602 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:36 crc kubenswrapper[4725]: E0227 06:13:36.250862 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:36 crc kubenswrapper[4725]: I0227 06:13:36.522236 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/1.log"
Feb 27 06:13:37 crc kubenswrapper[4725]: I0227 06:13:37.251243 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:37 crc kubenswrapper[4725]: I0227 06:13:37.251322 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:37 crc kubenswrapper[4725]: I0227 06:13:37.251255 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:37 crc kubenswrapper[4725]: E0227 06:13:37.251469 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:37 crc kubenswrapper[4725]: E0227 06:13:37.251711 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:37 crc kubenswrapper[4725]: E0227 06:13:37.251767 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:37 crc kubenswrapper[4725]: E0227 06:13:37.456179 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 06:13:38 crc kubenswrapper[4725]: I0227 06:13:38.251408 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:38 crc kubenswrapper[4725]: E0227 06:13:38.251667 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:39 crc kubenswrapper[4725]: I0227 06:13:39.251330 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:39 crc kubenswrapper[4725]: I0227 06:13:39.251382 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:39 crc kubenswrapper[4725]: I0227 06:13:39.251370 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:39 crc kubenswrapper[4725]: E0227 06:13:39.251507 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:39 crc kubenswrapper[4725]: E0227 06:13:39.251743 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:39 crc kubenswrapper[4725]: E0227 06:13:39.251823 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:40 crc kubenswrapper[4725]: I0227 06:13:40.250865 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:40 crc kubenswrapper[4725]: E0227 06:13:40.251495 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:40 crc kubenswrapper[4725]: I0227 06:13:40.252133 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"
Feb 27 06:13:40 crc kubenswrapper[4725]: I0227 06:13:40.542117 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/3.log"
Feb 27 06:13:40 crc kubenswrapper[4725]: I0227 06:13:40.547103 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerStarted","Data":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"}
Feb 27 06:13:40 crc kubenswrapper[4725]: I0227 06:13:40.548084 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9"
Feb 27 06:13:41 crc kubenswrapper[4725]: I0227 06:13:41.250648 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:41 crc kubenswrapper[4725]: I0227 06:13:41.250713 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:41 crc kubenswrapper[4725]: E0227 06:13:41.250783 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:41 crc kubenswrapper[4725]: E0227 06:13:41.250912 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:41 crc kubenswrapper[4725]: I0227 06:13:41.251033 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:41 crc kubenswrapper[4725]: E0227 06:13:41.251141 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:41 crc kubenswrapper[4725]: I0227 06:13:41.416960 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podStartSLOduration=136.416937918 podStartE2EDuration="2m16.416937918s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:40.586326724 +0000 UTC m=+199.048947383" watchObservedRunningTime="2026-02-27 06:13:41.416937918 +0000 UTC m=+199.879558507"
Feb 27 06:13:41 crc kubenswrapper[4725]: I0227 06:13:41.417550 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vcl2g"]
Feb 27 06:13:41 crc kubenswrapper[4725]: I0227 06:13:41.550228 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:41 crc kubenswrapper[4725]: E0227 06:13:41.550462 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:42 crc kubenswrapper[4725]: I0227 06:13:42.251080 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:42 crc kubenswrapper[4725]: E0227 06:13:42.253398 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:42 crc kubenswrapper[4725]: E0227 06:13:42.457973 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 06:13:43 crc kubenswrapper[4725]: I0227 06:13:43.250874 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:43 crc kubenswrapper[4725]: I0227 06:13:43.250886 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:43 crc kubenswrapper[4725]: E0227 06:13:43.251119 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:43 crc kubenswrapper[4725]: E0227 06:13:43.251195 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:43 crc kubenswrapper[4725]: I0227 06:13:43.250921 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:43 crc kubenswrapper[4725]: E0227 06:13:43.251402 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:44 crc kubenswrapper[4725]: I0227 06:13:44.251002 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:44 crc kubenswrapper[4725]: E0227 06:13:44.251338 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:45 crc kubenswrapper[4725]: I0227 06:13:45.250773 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:45 crc kubenswrapper[4725]: I0227 06:13:45.250825 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:45 crc kubenswrapper[4725]: I0227 06:13:45.250780 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:45 crc kubenswrapper[4725]: E0227 06:13:45.250995 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:45 crc kubenswrapper[4725]: E0227 06:13:45.251321 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:45 crc kubenswrapper[4725]: E0227 06:13:45.251179 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:46 crc kubenswrapper[4725]: I0227 06:13:46.251056 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:46 crc kubenswrapper[4725]: E0227 06:13:46.251247 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:47 crc kubenswrapper[4725]: I0227 06:13:47.251510 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:47 crc kubenswrapper[4725]: I0227 06:13:47.251624 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:47 crc kubenswrapper[4725]: I0227 06:13:47.251659 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:47 crc kubenswrapper[4725]: E0227 06:13:47.251834 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:47 crc kubenswrapper[4725]: E0227 06:13:47.251968 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:47 crc kubenswrapper[4725]: E0227 06:13:47.252124 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:47 crc kubenswrapper[4725]: E0227 06:13:47.459726 4725 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 06:13:48 crc kubenswrapper[4725]: I0227 06:13:48.251089 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:48 crc kubenswrapper[4725]: E0227 06:13:48.251226 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:49 crc kubenswrapper[4725]: I0227 06:13:49.250767 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:49 crc kubenswrapper[4725]: I0227 06:13:49.250824 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:49 crc kubenswrapper[4725]: I0227 06:13:49.250767 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:49 crc kubenswrapper[4725]: E0227 06:13:49.250959 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:49 crc kubenswrapper[4725]: E0227 06:13:49.251092 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:49 crc kubenswrapper[4725]: E0227 06:13:49.251176 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:50 crc kubenswrapper[4725]: I0227 06:13:50.250703 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:50 crc kubenswrapper[4725]: E0227 06:13:50.250997 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:50 crc kubenswrapper[4725]: I0227 06:13:50.251648 4725 scope.go:117] "RemoveContainer" containerID="275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd"
Feb 27 06:13:50 crc kubenswrapper[4725]: I0227 06:13:50.588609 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/1.log"
Feb 27 06:13:50 crc kubenswrapper[4725]: I0227 06:13:50.588960 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerStarted","Data":"e0f1f8817193a70ed97103d3c02f10cc027d4cd5706eedd949743fced02e5989"}
Feb 27 06:13:51 crc kubenswrapper[4725]: I0227 06:13:51.250451 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:51 crc kubenswrapper[4725]: I0227 06:13:51.250528 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:51 crc kubenswrapper[4725]: E0227 06:13:51.250649 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 06:13:51 crc kubenswrapper[4725]: I0227 06:13:51.250734 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:51 crc kubenswrapper[4725]: E0227 06:13:51.250929 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 06:13:51 crc kubenswrapper[4725]: E0227 06:13:51.251079 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vcl2g" podUID="ea1b7fec-c4c1-4ae5-a74a-8396d6428900"
Feb 27 06:13:52 crc kubenswrapper[4725]: I0227 06:13:52.250849 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 06:13:52 crc kubenswrapper[4725]: E0227 06:13:52.253363 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.175006 4725 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.250879 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.251412 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-27d6j"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.251537 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.252509 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.252649 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-27d6j"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.292988 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.295367 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.295795 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.295806 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn747"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.296281 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.296632 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.297041 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.298949 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zbhxj"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.299606 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.299908 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.300084 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.301538 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ndprs"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.301918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ndprs"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.302082 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-89pl9"]
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.302887 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.312229 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.312774 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.313087 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.313377 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.316401 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r5pzq"]
Feb 27 06:13:53 crc
kubenswrapper[4725]: I0227 06:13:53.317142 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.317889 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.318509 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.321588 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.322323 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.323244 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.323417 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.323644 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.330957 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l686v"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.331580 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-qr9hb"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.331709 
4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.331935 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.334760 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-b58nc"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.335066 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.353671 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.353924 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.365753 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.365889 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.366736 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.367264 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.368531 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"audit-1" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.368668 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.368664 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.368890 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.371409 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.372342 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.372568 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.372690 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.374257 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-audit\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.374353 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-image-import-ca\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.374463 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdkhg\" (UniqueName: \"kubernetes.io/projected/711e69a7-689f-47d6-840e-90ca3779ce5a-kube-api-access-hdkhg\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.374496 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/711e69a7-689f-47d6-840e-90ca3779ce5a-node-pullsecrets\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.374549 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-encryption-config\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.375782 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-config\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.376126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-etcd-client\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.380061 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.382595 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-etcd-serving-ca\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.382631 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-serving-cert\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.382691 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/711e69a7-689f-47d6-840e-90ca3779ce5a-audit-dir\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.396599 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.396802 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.398418 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.398560 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.398691 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.398921 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.399665 4725 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.399787 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.399906 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.400030 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.400137 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.400241 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.400365 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.400719 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.404526 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.404999 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.405234 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 
06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.405986 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.407089 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.407738 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.408602 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.408755 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.409011 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.409160 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.409411 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.409632 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.409822 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 06:13:53 crc 
kubenswrapper[4725]: I0227 06:13:53.410238 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.410749 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.410971 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.411083 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.411464 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mc28q"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.412001 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.413695 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.413766 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.414026 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.414091 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.414028 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.415496 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.416594 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.418225 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.418427 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.418586 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.418629 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.418812 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.418992 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419306 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419458 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419616 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419701 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419787 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419846 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.419988 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.420109 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.420213 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.420322 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.420419 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421100 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421261 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421488 4725 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421530 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421814 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421835 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.421263 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.422162 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.423010 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.424637 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.424920 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p26pd"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.426652 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.426947 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 
06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.426959 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.427104 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.427239 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.428682 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.428998 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.429240 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.431072 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.431179 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.431261 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.431380 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.431463 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.432175 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.433249 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.433689 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.437089 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-28snx"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.439949 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: 
I0227 06:13:53.440941 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.441127 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.441843 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.451729 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.452810 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.452938 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.453329 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.456476 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.464936 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.466361 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.475904 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.480594 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.480849 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.481040 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.481722 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.482039 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.482190 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.483461 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.483748 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.484618 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.484787 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.485495 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.485761 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.488538 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.489276 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.496953 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.498734 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/711e69a7-689f-47d6-840e-90ca3779ce5a-audit-dir\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.498793 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9v749"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.498829 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/711e69a7-689f-47d6-840e-90ca3779ce5a-audit-dir\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.498867 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scd7x\" (UniqueName: \"kubernetes.io/projected/6e178501-6e8d-4fcd-abf4-b13cd287501b-kube-api-access-scd7x\") pod \"downloads-7954f5f757-b58nc\" (UID: \"6e178501-6e8d-4fcd-abf4-b13cd287501b\") " pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.498919 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-serving-cert\") pod \"console-operator-58897d9998-r5pzq\" (UID: 
\"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.498954 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-trusted-ca-bundle\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499098 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499136 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h487b\" (UniqueName: \"kubernetes.io/projected/e0a18ecb-f59a-412e-b224-0bdbd115bd90-kube-api-access-h487b\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499154 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96ae065a-f626-466b-a56e-089250cf7405-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499233 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-audit\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499276 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-machine-approver-tls\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499357 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eda518bc-5412-413a-8958-ea97c24a9795-serving-cert\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499511 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499858 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/682c856f-0661-4039-b071-e5c75267f3f1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.499977 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/682c856f-0661-4039-b071-e5c75267f3f1-images\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500030 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb380083-e346-435c-8290-ab59f1a1d190-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500083 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-config\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500109 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500141 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-config\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500158 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-audit\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjww\" (UniqueName: \"kubernetes.io/projected/b0dcb291-e867-45d5-91f3-fa9b18a090c5-kube-api-access-brjww\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " 
pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500191 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-client-ca\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500212 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500231 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7ks\" (UniqueName: \"kubernetes.io/projected/4ac52b3f-d9d6-499e-b0e7-d2fb07de2780-kube-api-access-4r7ks\") pod \"dns-operator-744455d44c-ndprs\" (UID: \"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780\") " pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500740 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500798 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-service-ca\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500825 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500849 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500867 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ac52b3f-d9d6-499e-b0e7-d2fb07de2780-metrics-tls\") pod \"dns-operator-744455d44c-ndprs\" (UID: \"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780\") " pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500886 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-config\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " 
pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500910 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-config\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500926 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-etcd-client\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500949 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500969 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-policies\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.500991 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-serving-cert\") pod 
\"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501010 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501029 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501048 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ae065a-f626-466b-a56e-089250cf7405-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501093 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-config\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc 
kubenswrapper[4725]: I0227 06:13:53.501114 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747c8\" (UniqueName: \"kubernetes.io/projected/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-kube-api-access-747c8\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501130 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501146 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/96ae065a-f626-466b-a56e-089250cf7405-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501167 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501189 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/682c856f-0661-4039-b071-e5c75267f3f1-config\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501210 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-image-import-ca\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501225 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-dir\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501244 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-oauth-serving-cert\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501264 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9904efe3-73fc-4efd-ac2c-39a0653315ba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zclkw\" (UID: \"9904efe3-73fc-4efd-ac2c-39a0653315ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:53 crc kubenswrapper[4725]: 
I0227 06:13:53.501340 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbh79\" (UniqueName: \"kubernetes.io/projected/eda518bc-5412-413a-8958-ea97c24a9795-kube-api-access-sbh79\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501364 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-service-ca-bundle\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501407 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501427 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74x47\" (UniqueName: \"kubernetes.io/projected/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-kube-api-access-74x47\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501448 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-trusted-ca\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501464 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-oauth-config\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501486 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdkhg\" (UniqueName: \"kubernetes.io/projected/711e69a7-689f-47d6-840e-90ca3779ce5a-kube-api-access-hdkhg\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501503 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpjhr\" (UniqueName: \"kubernetes.io/projected/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-kube-api-access-hpjhr\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501717 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501735 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcv6w\" (UniqueName: \"kubernetes.io/projected/682c856f-0661-4039-b071-e5c75267f3f1-kube-api-access-bcv6w\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501759 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/711e69a7-689f-47d6-840e-90ca3779ce5a-node-pullsecrets\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501779 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jk2j\" (UniqueName: \"kubernetes.io/projected/9904efe3-73fc-4efd-ac2c-39a0653315ba-kube-api-access-2jk2j\") pod \"cluster-samples-operator-665b6dd947-zclkw\" (UID: \"9904efe3-73fc-4efd-ac2c-39a0653315ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501807 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-encryption-config\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501825 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-serving-cert\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501845 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-auth-proxy-config\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501863 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-config\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501883 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeead09f-faab-4a7f-8aed-7a7823d350bc-serving-cert\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501903 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdnq\" (UniqueName: \"kubernetes.io/projected/aeead09f-faab-4a7f-8aed-7a7823d350bc-kube-api-access-7pdnq\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 
06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.501927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp58c\" (UniqueName: \"kubernetes.io/projected/fb380083-e346-435c-8290-ab59f1a1d190-kube-api-access-tp58c\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.502520 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-config\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.502714 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmwz5"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb380083-e346-435c-8290-ab59f1a1d190-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503359 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-etcd-serving-ca\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503382 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503399 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4bgw\" (UniqueName: \"kubernetes.io/projected/96ae065a-f626-466b-a56e-089250cf7405-kube-api-access-g4bgw\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503728 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.502719 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/711e69a7-689f-47d6-840e-90ca3779ce5a-node-pullsecrets\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.503279 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-image-import-ca\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.504861 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-etcd-serving-ca\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.507856 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.508585 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.509112 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.507906 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/711e69a7-689f-47d6-840e-90ca3779ce5a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.509212 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.509273 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.512647 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-serving-cert\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.515215 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-etcd-client\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.516197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/711e69a7-689f-47d6-840e-90ca3779ce5a-encryption-config\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.517046 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.517353 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.517433 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.517808 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.517843 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.517955 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.519197 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.519659 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-g7kxx"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.520054 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.520438 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536212-627nh"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.536542 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.538646 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.539166 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.539658 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.539931 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.540622 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.545997 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l686v"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.546038 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zbhxj"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.546054 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.546073 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn747"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.546086 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.546165 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536212-627nh" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.549355 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-89pl9"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.549439 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r5pzq"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.549453 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.549570 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.552344 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64j8f"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.553649 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.554458 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.554736 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pmdnc"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.555460 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pmdnc" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.556655 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.557610 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.560990 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-27d6j"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.563246 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.567999 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b58nc"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.570721 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.571120 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.572939 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qr9hb"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.574517 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p26pd"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.574598 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 06:13:53 crc 
kubenswrapper[4725]: I0227 06:13:53.576103 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.577981 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.579168 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.581010 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.581957 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ndprs"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.583271 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mc28q"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.590462 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.590507 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.591676 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.592834 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.592971 4725 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.595426 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmwz5"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.595488 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ftrcl"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.596218 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-p5m2x"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.596557 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ftrcl" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.596813 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.597759 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.598955 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-g7kxx"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.600102 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pmdnc"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.601145 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536212-627nh"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.602269 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 
06:13:53.603408 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9v749"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.604140 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcv6w\" (UniqueName: \"kubernetes.io/projected/682c856f-0661-4039-b071-e5c75267f3f1-kube-api-access-bcv6w\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.604177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jk2j\" (UniqueName: \"kubernetes.io/projected/9904efe3-73fc-4efd-ac2c-39a0653315ba-kube-api-access-2jk2j\") pod \"cluster-samples-operator-665b6dd947-zclkw\" (UID: \"9904efe3-73fc-4efd-ac2c-39a0653315ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.604207 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-serving-cert\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.604239 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-auth-proxy-config\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605036 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-auth-proxy-config\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605114 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-config\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605142 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeead09f-faab-4a7f-8aed-7a7823d350bc-serving-cert\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605168 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdnq\" (UniqueName: \"kubernetes.io/projected/aeead09f-faab-4a7f-8aed-7a7823d350bc-kube-api-access-7pdnq\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605199 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp58c\" (UniqueName: \"kubernetes.io/projected/fb380083-e346-435c-8290-ab59f1a1d190-kube-api-access-tp58c\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605271 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb380083-e346-435c-8290-ab59f1a1d190-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.605720 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.607252 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.606513 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-config\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.606060 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb380083-e346-435c-8290-ab59f1a1d190-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.607180 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-ocp-branding-template\") 
pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.607857 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4bgw\" (UniqueName: \"kubernetes.io/projected/96ae065a-f626-466b-a56e-089250cf7405-kube-api-access-g4bgw\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.607881 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scd7x\" (UniqueName: \"kubernetes.io/projected/6e178501-6e8d-4fcd-abf4-b13cd287501b-kube-api-access-scd7x\") pod \"downloads-7954f5f757-b58nc\" (UID: \"6e178501-6e8d-4fcd-abf4-b13cd287501b\") " pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.607907 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-serving-cert\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.607931 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-trusted-ca-bundle\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.608121 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.608174 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h487b\" (UniqueName: \"kubernetes.io/projected/e0a18ecb-f59a-412e-b224-0bdbd115bd90-kube-api-access-h487b\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.608270 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64j8f"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609203 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609325 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96ae065a-f626-466b-a56e-089250cf7405-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609350 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-trusted-ca-bundle\") pod \"console-f9d7485db-qr9hb\" (UID: 
\"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609478 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96ae065a-f626-466b-a56e-089250cf7405-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609530 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-trusted-ca\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609558 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-machine-approver-tls\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609634 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eda518bc-5412-413a-8958-ea97c24a9795-serving-cert\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619502 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/682c856f-0661-4039-b071-e5c75267f3f1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619567 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/682c856f-0661-4039-b071-e5c75267f3f1-images\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb380083-e346-435c-8290-ab59f1a1d190-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619706 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-config\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 
06:13:53.619748 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619774 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-config\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619804 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njfpc\" (UniqueName: \"kubernetes.io/projected/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-kube-api-access-njfpc\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619834 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjww\" (UniqueName: \"kubernetes.io/projected/b0dcb291-e867-45d5-91f3-fa9b18a090c5-kube-api-access-brjww\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619860 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-client-ca\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619884 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619904 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7ks\" (UniqueName: \"kubernetes.io/projected/4ac52b3f-d9d6-499e-b0e7-d2fb07de2780-kube-api-access-4r7ks\") pod \"dns-operator-744455d44c-ndprs\" (UID: \"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780\") " pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619925 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-machine-approver-tls\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.609838 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ftrcl"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.619930 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 
crc kubenswrapper[4725]: I0227 06:13:53.620040 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-serving-cert\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620067 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620113 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-metrics-tls\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620168 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-service-ca\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620228 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: 
\"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620274 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ac52b3f-d9d6-499e-b0e7-d2fb07de2780-metrics-tls\") pod \"dns-operator-744455d44c-ndprs\" (UID: \"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780\") " pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620342 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620377 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-config\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620422 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-policies\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620503 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620543 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620573 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ae065a-f626-466b-a56e-089250cf7405-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620610 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-config\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620645 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-747c8\" (UniqueName: \"kubernetes.io/projected/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-kube-api-access-747c8\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620678 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620684 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620720 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/96ae065a-f626-466b-a56e-089250cf7405-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" 
Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620759 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620793 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682c856f-0661-4039-b071-e5c75267f3f1-config\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620827 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-dir\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620901 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-oauth-serving-cert\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620933 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9904efe3-73fc-4efd-ac2c-39a0653315ba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zclkw\" (UID: 
\"9904efe3-73fc-4efd-ac2c-39a0653315ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620966 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbh79\" (UniqueName: \"kubernetes.io/projected/eda518bc-5412-413a-8958-ea97c24a9795-kube-api-access-sbh79\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621000 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzps5\" (UniqueName: \"kubernetes.io/projected/109c9025-3aff-4dc5-a351-c6686c336671-kube-api-access-bzps5\") pod \"migrator-59844c95c7-sj48h\" (UID: \"109c9025-3aff-4dc5-a351-c6686c336671\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621038 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-service-ca-bundle\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621098 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621136 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-74x47\" (UniqueName: \"kubernetes.io/projected/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-kube-api-access-74x47\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621175 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-trusted-ca\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621210 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-oauth-config\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621253 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpjhr\" (UniqueName: \"kubernetes.io/projected/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-kube-api-access-hpjhr\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621323 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: 
\"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621895 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.622581 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-config\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.620649 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eda518bc-5412-413a-8958-ea97c24a9795-serving-cert\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.622754 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-config\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.622811 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.622942 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/682c856f-0661-4039-b071-e5c75267f3f1-images\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.616197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeead09f-faab-4a7f-8aed-7a7823d350bc-serving-cert\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.623408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-policies\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.623608 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-config\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.624235 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fb380083-e346-435c-8290-ab59f1a1d190-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.624778 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.624791 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682c856f-0661-4039-b071-e5c75267f3f1-config\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.625366 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.625774 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-client-ca\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 
06:13:53.625824 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-config\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.625943 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/682c856f-0661-4039-b071-e5c75267f3f1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.626563 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.626585 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-oauth-serving-cert\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.626659 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-dir\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 
06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.626789 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.627512 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.627592 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-service-ca\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.627853 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.628659 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeead09f-faab-4a7f-8aed-7a7823d350bc-service-ca-bundle\") pod \"authentication-operator-69f744f599-l686v\" (UID: 
\"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.630098 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/96ae065a-f626-466b-a56e-089250cf7405-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.630208 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.630393 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ac52b3f-d9d6-499e-b0e7-d2fb07de2780-metrics-tls\") pod \"dns-operator-744455d44c-ndprs\" (UID: \"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780\") " pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.621135 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-serving-cert\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.631089 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 
27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.631122 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.631230 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-trusted-ca\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.632959 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-oauth-config\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.633182 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.633253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9904efe3-73fc-4efd-ac2c-39a0653315ba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zclkw\" (UID: \"9904efe3-73fc-4efd-ac2c-39a0653315ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.634046 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.634100 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.634333 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.636197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.636500 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"] Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.654011 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.674908 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 
06:13:53.693734 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.715652 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.722749 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzps5\" (UniqueName: \"kubernetes.io/projected/109c9025-3aff-4dc5-a351-c6686c336671-kube-api-access-bzps5\") pod \"migrator-59844c95c7-sj48h\" (UID: \"109c9025-3aff-4dc5-a351-c6686c336671\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.722969 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-trusted-ca\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.723047 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njfpc\" (UniqueName: \"kubernetes.io/projected/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-kube-api-access-njfpc\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.723137 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-metrics-tls\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 
crc kubenswrapper[4725]: I0227 06:13:53.723188 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.728135 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-metrics-tls\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.741253 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.746556 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-trusted-ca\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.753972 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.775749 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.793364 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.813474 4725 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.834660 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.854952 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.873488 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.894576 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.913429 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.934275 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.955374 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.974254 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 06:13:53 crc kubenswrapper[4725]: I0227 06:13:53.994983 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.014687 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.033602 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.054632 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.073984 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.095064 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.115648 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.134163 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.154863 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.174680 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.193988 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.215366 4725 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.235629 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.251030 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.255249 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.274498 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.294525 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.314143 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.335329 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.354388 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.374619 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.435735 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.454331 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.494627 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.506956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdkhg\" (UniqueName: \"kubernetes.io/projected/711e69a7-689f-47d6-840e-90ca3779ce5a-kube-api-access-hdkhg\") pod \"apiserver-76f77b778f-27d6j\" (UID: \"711e69a7-689f-47d6-840e-90ca3779ce5a\") " pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.518476 4725 request.go:700] Waited for 1.014295024s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-dockercfg-5nsgg&limit=500&resourceVersion=0 Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.520403 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.524052 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.535797 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.576549 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.579566 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.596799 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.614961 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.634373 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.655644 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.718515 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.720704 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.721500 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" 
Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.734117 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.754142 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.774914 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.794152 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.814525 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.833506 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.845846 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-27d6j"] Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.853995 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 06:13:54 crc kubenswrapper[4725]: W0227 06:13:54.855627 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711e69a7_689f_47d6_840e_90ca3779ce5a.slice/crio-666de51a2747a2c1d729e544dccde298e48bbede438e157781c12d233e3eac7c WatchSource:0}: Error finding container 
666de51a2747a2c1d729e544dccde298e48bbede438e157781c12d233e3eac7c: Status 404 returned error can't find the container with id 666de51a2747a2c1d729e544dccde298e48bbede438e157781c12d233e3eac7c Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.874520 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.895550 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.913771 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.933398 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.953689 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.974695 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 06:13:54 crc kubenswrapper[4725]: I0227 06:13:54.994048 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.013909 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.034745 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.054731 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.074377 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.094031 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.115331 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.133864 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.153860 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.174506 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.195156 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.214084 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.234991 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.255211 4725 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 06:13:55 crc kubenswrapper[4725]: 
I0227 06:13:55.275230 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.295110 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.314075 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.334514 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.353979 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.374100 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.394210 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.414768 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.435590 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.454871 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.474607 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 06:13:55 crc kubenswrapper[4725]: 
I0227 06:13:55.527003 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcv6w\" (UniqueName: \"kubernetes.io/projected/682c856f-0661-4039-b071-e5c75267f3f1-kube-api-access-bcv6w\") pod \"machine-api-operator-5694c8668f-89pl9\" (UID: \"682c856f-0661-4039-b071-e5c75267f3f1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.532610 4725 request.go:700] Waited for 1.9272645s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.542231 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jk2j\" (UniqueName: \"kubernetes.io/projected/9904efe3-73fc-4efd-ac2c-39a0653315ba-kube-api-access-2jk2j\") pod \"cluster-samples-operator-665b6dd947-zclkw\" (UID: \"9904efe3-73fc-4efd-ac2c-39a0653315ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.563554 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdnq\" (UniqueName: \"kubernetes.io/projected/aeead09f-faab-4a7f-8aed-7a7823d350bc-kube-api-access-7pdnq\") pod \"authentication-operator-69f744f599-l686v\" (UID: \"aeead09f-faab-4a7f-8aed-7a7823d350bc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.584162 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.586567 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp58c\" (UniqueName: \"kubernetes.io/projected/fb380083-e346-435c-8290-ab59f1a1d190-kube-api-access-tp58c\") pod \"openshift-apiserver-operator-796bbdcf4f-mvds5\" (UID: \"fb380083-e346-435c-8290-ab59f1a1d190\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.606063 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4bgw\" (UniqueName: \"kubernetes.io/projected/96ae065a-f626-466b-a56e-089250cf7405-kube-api-access-g4bgw\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.607934 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.618637 4725 generic.go:334] "Generic (PLEG): container finished" podID="711e69a7-689f-47d6-840e-90ca3779ce5a" containerID="cfa483614137f79ee231dd8074393e6c66ec16a48747bf769761e8bc8c0d361c" exitCode=0 Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.618756 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" event={"ID":"711e69a7-689f-47d6-840e-90ca3779ce5a","Type":"ContainerDied","Data":"cfa483614137f79ee231dd8074393e6c66ec16a48747bf769761e8bc8c0d361c"} Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.618847 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" event={"ID":"711e69a7-689f-47d6-840e-90ca3779ce5a","Type":"ContainerStarted","Data":"666de51a2747a2c1d729e544dccde298e48bbede438e157781c12d233e3eac7c"} Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.622193 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scd7x\" (UniqueName: \"kubernetes.io/projected/6e178501-6e8d-4fcd-abf4-b13cd287501b-kube-api-access-scd7x\") pod \"downloads-7954f5f757-b58nc\" (UID: \"6e178501-6e8d-4fcd-abf4-b13cd287501b\") " pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.641812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h487b\" (UniqueName: \"kubernetes.io/projected/e0a18ecb-f59a-412e-b224-0bdbd115bd90-kube-api-access-h487b\") pod \"oauth-openshift-558db77b4-zbhxj\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.653926 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjww\" (UniqueName: 
\"kubernetes.io/projected/b0dcb291-e867-45d5-91f3-fa9b18a090c5-kube-api-access-brjww\") pod \"console-f9d7485db-qr9hb\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") " pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.657341 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.685052 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-747c8\" (UniqueName: \"kubernetes.io/projected/5aa28370-7d59-4c1c-aac9-582c6ca3da2f-kube-api-access-747c8\") pod \"console-operator-58897d9998-r5pzq\" (UID: \"5aa28370-7d59-4c1c-aac9-582c6ca3da2f\") " pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.712065 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbh79\" (UniqueName: \"kubernetes.io/projected/eda518bc-5412-413a-8958-ea97c24a9795-kube-api-access-sbh79\") pod \"controller-manager-879f6c89f-bn747\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.722169 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ae065a-f626-466b-a56e-089250cf7405-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7cjxg\" (UID: \"96ae065a-f626-466b-a56e-089250cf7405\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.743570 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7ks\" (UniqueName: \"kubernetes.io/projected/4ac52b3f-d9d6-499e-b0e7-d2fb07de2780-kube-api-access-4r7ks\") pod \"dns-operator-744455d44c-ndprs\" (UID: 
\"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780\") " pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.755360 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74x47\" (UniqueName: \"kubernetes.io/projected/3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d-kube-api-access-74x47\") pod \"openshift-controller-manager-operator-756b6f6bc6-glpkv\" (UID: \"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.769860 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.773323 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpjhr\" (UniqueName: \"kubernetes.io/projected/89db0909-19ba-4b2f-adaf-9bc42c9efd1d-kube-api-access-hpjhr\") pod \"machine-approver-56656f9798-bgb8c\" (UID: \"89db0909-19ba-4b2f-adaf-9bc42c9efd1d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.787214 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.793955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzps5\" (UniqueName: \"kubernetes.io/projected/109c9025-3aff-4dc5-a351-c6686c336671-kube-api-access-bzps5\") pod \"migrator-59844c95c7-sj48h\" (UID: \"109c9025-3aff-4dc5-a351-c6686c336671\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.801426 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.805502 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.813166 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njfpc\" (UniqueName: \"kubernetes.io/projected/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-kube-api-access-njfpc\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.822562 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.827339 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.841017 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1002e916-f907-4ae4-bc0f-3a08f4b70f5f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c8mzb\" (UID: \"1002e916-f907-4ae4-bc0f-3a08f4b70f5f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.842320 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.845714 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw"] Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.863702 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" Feb 27 06:13:55 crc kubenswrapper[4725]: W0227 06:13:55.864134 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89db0909_19ba_4b2f_adaf_9bc42c9efd1d.slice/crio-dfac07a2e4d602f850287756f66d0573132c9b50b561229088155f51be309d0c WatchSource:0}: Error finding container dfac07a2e4d602f850287756f66d0573132c9b50b561229088155f51be309d0c: Status 404 returned error can't find the container with id dfac07a2e4d602f850287756f66d0573132c9b50b561229088155f51be309d0c Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.886758 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l686v"] Feb 27 06:13:55 crc kubenswrapper[4725]: W0227 06:13:55.942784 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeead09f_faab_4a7f_8aed_7a7823d350bc.slice/crio-f70991308c3de9e489864c86b8261a1cb28b23f5b65d616618a5e1897835e575 WatchSource:0}: Error finding container f70991308c3de9e489864c86b8261a1cb28b23f5b65d616618a5e1897835e575: Status 404 returned error can't find the container with id f70991308c3de9e489864c86b8261a1cb28b23f5b65d616618a5e1897835e575 Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.945591 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958233 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6931a0dd-a354-441f-ae2c-3d6c3e59777e-serving-cert\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/affbff56-7e60-4b68-9efa-eb610da84f54-config\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958313 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0d4003-2819-4304-ad4f-0815dd53db79-config\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958351 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958374 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2aec175b-6e2b-4eac-a94f-771881386ffc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x796f\" (UID: \"2aec175b-6e2b-4eac-a94f-771881386ffc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958409 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-tls\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958461 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-ca\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958519 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ad543f-d10f-4d38-bf97-fc1fd668e30f-service-ca-bundle\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958539 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-serving-cert\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958557 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-serving-cert\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/affbff56-7e60-4b68-9efa-eb610da84f54-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958614 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-encryption-config\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958632 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23b57c5c-dc24-40cd-8563-9e43ddc844dc-serving-cert\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958687 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-config\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958708 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf9pq\" (UniqueName: \"kubernetes.io/projected/6931a0dd-a354-441f-ae2c-3d6c3e59777e-kube-api-access-cf9pq\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958723 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-client\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958761 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cfc908de-22b4-4ca7-a60e-1ed8592f563e-audit-dir\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958783 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea0d4003-2819-4304-ad4f-0815dd53db79-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958816 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958837 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-bound-sa-token\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958858 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnnwp\" (UniqueName: \"kubernetes.io/projected/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-kube-api-access-bnnwp\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958891 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958941 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/23b57c5c-dc24-40cd-8563-9e43ddc844dc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958964 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-trusted-ca\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.958986 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05387b29-effd-4fb4-9bd4-21acd9989f76-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959009 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx76n\" (UniqueName: \"kubernetes.io/projected/cfc908de-22b4-4ca7-a60e-1ed8592f563e-kube-api-access-wx76n\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959030 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05387b29-effd-4fb4-9bd4-21acd9989f76-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05387b29-effd-4fb4-9bd4-21acd9989f76-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959070 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-service-ca\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959096 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-etcd-client\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959156 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-stats-auth\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959188 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-certificates\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959205 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66cj\" (UniqueName: \"kubernetes.io/projected/40ad543f-d10f-4d38-bf97-fc1fd668e30f-kube-api-access-w66cj\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959234 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: E0227 06:13:55.963619 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.463601528 +0000 UTC m=+214.926222097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.967102 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b58nc"]
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.967561 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.959278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkm57\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-kube-api-access-kkm57\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968460 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0d4003-2819-4304-ad4f-0815dd53db79-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968524 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968548 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-audit-policies\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968569 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-default-certificate\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968606 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/affbff56-7e60-4b68-9efa-eb610da84f54-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968680 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d986p\" (UniqueName: \"kubernetes.io/projected/23b57c5c-dc24-40cd-8563-9e43ddc844dc-kube-api-access-d986p\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968709 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-config\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968749 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-metrics-certs\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968794 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrg7\" (UniqueName: \"kubernetes.io/projected/2aec175b-6e2b-4eac-a94f-771881386ffc-kube-api-access-7mrg7\") pod \"control-plane-machine-set-operator-78cbb6b69f-x796f\" (UID: \"2aec175b-6e2b-4eac-a94f-771881386ffc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f"
Feb 27 06:13:55 crc kubenswrapper[4725]: I0227 06:13:55.968815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-client-ca\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.006418 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.055076 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069466 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069628 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b4a7f10-c643-4e7c-b4c9-76bef06ce76d-cert\") pod \"ingress-canary-ftrcl\" (UID: \"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d\") " pod="openshift-ingress-canary/ingress-canary-ftrcl"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069652 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-encryption-config\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069668 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23b57c5c-dc24-40cd-8563-9e43ddc844dc-serving-cert\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069683 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-client\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069698 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf9pq\" (UniqueName: \"kubernetes.io/projected/6931a0dd-a354-441f-ae2c-3d6c3e59777e-kube-api-access-cf9pq\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069723 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-config\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069741 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-proxy-tls\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069757 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8j7\" (UniqueName: \"kubernetes.io/projected/7b4a7f10-c643-4e7c-b4c9-76bef06ce76d-kube-api-access-6b8j7\") pod \"ingress-canary-ftrcl\" (UID: \"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d\") " pod="openshift-ingress-canary/ingress-canary-ftrcl"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069772 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a24054f-e850-4d6b-b15e-e37115316230-serving-cert\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069787 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdbr\" (UniqueName: \"kubernetes.io/projected/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-kube-api-access-pfdbr\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069804 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069818 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pqv\" (UniqueName: \"kubernetes.io/projected/a7826377-b3f7-47a0-8967-04fa1243de5f-kube-api-access-52pqv\") pod \"multus-admission-controller-857f4d67dd-9v749\" (UID: \"a7826377-b3f7-47a0-8967-04fa1243de5f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069834 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l5hm\" (UniqueName: \"kubernetes.io/projected/6b2da58a-3e24-4e72-a25d-eeee730910cd-kube-api-access-4l5hm\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069848 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a24054f-e850-4d6b-b15e-e37115316230-config\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069864 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cfc908de-22b4-4ca7-a60e-1ed8592f563e-audit-dir\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069879 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea0d4003-2819-4304-ad4f-0815dd53db79-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069895 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-metrics-tls\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069911 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgm7z\" (UniqueName: \"kubernetes.io/projected/49bf7d38-dc2d-4f9c-9050-202bb1e40747-kube-api-access-tgm7z\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069938 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069954 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-bound-sa-token\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069969 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2v98\" (UniqueName: \"kubernetes.io/projected/234512e0-3471-4bd8-b783-6df7b63f2cfe-kube-api-access-s2v98\") pod \"auto-csr-approver-29536212-627nh\" (UID: \"234512e0-3471-4bd8-b783-6df7b63f2cfe\") " pod="openshift-infra/auto-csr-approver-29536212-627nh"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.069986 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070000 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-mountpoint-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070016 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnnwp\" (UniqueName: \"kubernetes.io/projected/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-kube-api-access-bnnwp\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070032 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlc6k\" (UniqueName: \"kubernetes.io/projected/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-kube-api-access-qlc6k\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070050 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-tmpfs\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070067 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070082 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-registration-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070096 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4zz\" (UniqueName: \"kubernetes.io/projected/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-kube-api-access-7c4zz\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-config-volume\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070145 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/23b57c5c-dc24-40cd-8563-9e43ddc844dc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070161 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snc2r\" (UniqueName: \"kubernetes.io/projected/ec8abb87-07ac-4227-a787-6db444002c0c-kube-api-access-snc2r\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-trusted-ca\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070193 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05387b29-effd-4fb4-9bd4-21acd9989f76-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070209 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5btl\" (UniqueName: \"kubernetes.io/projected/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-kube-api-access-q5btl\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070233 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx76n\" (UniqueName: \"kubernetes.io/projected/cfc908de-22b4-4ca7-a60e-1ed8592f563e-kube-api-access-wx76n\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070249 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05387b29-effd-4fb4-9bd4-21acd9989f76-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070265 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ec8abb87-07ac-4227-a787-6db444002c0c-signing-cabundle\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5rm\" (UniqueName: \"kubernetes.io/projected/04f492bb-9555-42d8-9e3a-1e419c3d7607-kube-api-access-kq5rm\") pod \"package-server-manager-789f6589d5-497vv\" (UID: \"04f492bb-9555-42d8-9e3a-1e419c3d7607\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070341 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05387b29-effd-4fb4-9bd4-21acd9989f76-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070361 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-service-ca\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070377 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-webhook-cert\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070393 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-etcd-client\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070408 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cd133d9f-4963-42d4-bd73-f09c510629cc-srv-cert\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070425 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-srv-cert\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070441 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf5jk\" (UniqueName: \"kubernetes.io/projected/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-kube-api-access-tf5jk\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070484 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070500 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d1908a5-f826-4eae-a6fa-c899dda28b57-secret-volume\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070517 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-stats-auth\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " 
pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070541 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-certificates\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070556 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w66cj\" (UniqueName: \"kubernetes.io/projected/40ad543f-d10f-4d38-bf97-fc1fd668e30f-kube-api-access-w66cj\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070620 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49bf7d38-dc2d-4f9c-9050-202bb1e40747-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070665 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0d4003-2819-4304-ad4f-0815dd53db79-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070732 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070787 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkm57\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-kube-api-access-kkm57\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070856 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-audit-policies\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070881 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-default-certificate\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 
06:13:56.070909 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/affbff56-7e60-4b68-9efa-eb610da84f54-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070952 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7826377-b3f7-47a0-8967-04fa1243de5f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9v749\" (UID: \"a7826377-b3f7-47a0-8967-04fa1243de5f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070970 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d1908a5-f826-4eae-a6fa-c899dda28b57-config-volume\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.070991 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-socket-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.071030 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-config\") pod \"etcd-operator-b45778765-mc28q\" (UID: 
\"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.071053 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f492bb-9555-42d8-9e3a-1e419c3d7607-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-497vv\" (UID: \"04f492bb-9555-42d8-9e3a-1e419c3d7607\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.072167 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.572141555 +0000 UTC m=+215.034762124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.073131 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-config\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.073198 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cfc908de-22b4-4ca7-a60e-1ed8592f563e-audit-dir\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.073960 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.080455 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d986p\" (UniqueName: \"kubernetes.io/projected/23b57c5c-dc24-40cd-8563-9e43ddc844dc-kube-api-access-d986p\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.080519 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/49bf7d38-dc2d-4f9c-9050-202bb1e40747-images\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.080540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-plugins-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 
06:13:56.080568 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-metrics-certs\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.080608 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvcbf\" (UniqueName: \"kubernetes.io/projected/cd133d9f-4963-42d4-bd73-f09c510629cc-kube-api-access-xvcbf\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.080631 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrg7\" (UniqueName: \"kubernetes.io/projected/2aec175b-6e2b-4eac-a94f-771881386ffc-kube-api-access-7mrg7\") pod \"control-plane-machine-set-operator-78cbb6b69f-x796f\" (UID: \"2aec175b-6e2b-4eac-a94f-771881386ffc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.080650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-client-ca\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.082347 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6931a0dd-a354-441f-ae2c-3d6c3e59777e-serving-cert\") pod \"etcd-operator-b45778765-mc28q\" (UID: 
\"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.082550 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/affbff56-7e60-4b68-9efa-eb610da84f54-config\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.082615 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2nc6\" (UniqueName: \"kubernetes.io/projected/5e72c156-e23c-414e-a88d-5cef5965f47e-kube-api-access-w2nc6\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.082634 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-csi-data-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.083855 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0d4003-2819-4304-ad4f-0815dd53db79-config\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.084578 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/23b57c5c-dc24-40cd-8563-9e43ddc844dc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.085155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.086017 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2aec175b-6e2b-4eac-a94f-771881386ffc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x796f\" (UID: \"2aec175b-6e2b-4eac-a94f-771881386ffc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.086812 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-tls\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.086918 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-apiservice-cert\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.086947 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-profile-collector-cert\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.087247 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-ca\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.087719 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ec8abb87-07ac-4227-a787-6db444002c0c-signing-key\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088010 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ad543f-d10f-4d38-bf97-fc1fd668e30f-service-ca-bundle\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088041 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhhnk\" (UniqueName: 
\"kubernetes.io/projected/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-kube-api-access-dhhnk\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088064 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwvmn\" (UniqueName: \"kubernetes.io/projected/5a24054f-e850-4d6b-b15e-e37115316230-kube-api-access-vwvmn\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088100 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-serving-cert\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088123 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5e72c156-e23c-414e-a88d-5cef5965f47e-node-bootstrap-token\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088171 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cd133d9f-4963-42d4-bd73-f09c510629cc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088200 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-serving-cert\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.088864 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ad543f-d10f-4d38-bf97-fc1fd668e30f-service-ca-bundle\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.089768 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-trusted-ca\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.089861 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49bf7d38-dc2d-4f9c-9050-202bb1e40747-proxy-tls\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.089904 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjk9w\" (UniqueName: \"kubernetes.io/projected/5d1908a5-f826-4eae-a6fa-c899dda28b57-kube-api-access-qjk9w\") pod 
\"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.089931 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5e72c156-e23c-414e-a88d-5cef5965f47e-certs\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.089966 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.090067 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/affbff56-7e60-4b68-9efa-eb610da84f54-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.090184 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-client\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.091335 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23b57c5c-dc24-40cd-8563-9e43ddc844dc-serving-cert\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.094876 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-serving-cert\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.095739 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-encryption-config\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.097649 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-service-ca\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.099790 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05387b29-effd-4fb4-9bd4-21acd9989f76-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" Feb 27 06:13:56 crc 
kubenswrapper[4725]: I0227 06:13:56.104328 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05387b29-effd-4fb4-9bd4-21acd9989f76-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.105382 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.107382 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-config\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.108339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6931a0dd-a354-441f-ae2c-3d6c3e59777e-etcd-ca\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.110215 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66cj\" (UniqueName: \"kubernetes.io/projected/40ad543f-d10f-4d38-bf97-fc1fd668e30f-kube-api-access-w66cj\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 
crc kubenswrapper[4725]: I0227 06:13:56.116340 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-etcd-client\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.116587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cfc908de-22b4-4ca7-a60e-1ed8592f563e-audit-policies\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.116992 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfc908de-22b4-4ca7-a60e-1ed8592f563e-serving-cert\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.120008 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-certificates\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.120387 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-stats-auth\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 
06:13:56.120644 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-client-ca\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.120919 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2aec175b-6e2b-4eac-a94f-771881386ffc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x796f\" (UID: \"2aec175b-6e2b-4eac-a94f-771881386ffc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.120949 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/affbff56-7e60-4b68-9efa-eb610da84f54-config\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.123365 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-default-certificate\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.123509 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0d4003-2819-4304-ad4f-0815dd53db79-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: 
\"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.126324 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6931a0dd-a354-441f-ae2c-3d6c3e59777e-serving-cert\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.127826 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40ad543f-d10f-4d38-bf97-fc1fd668e30f-metrics-certs\") pod \"router-default-5444994796-28snx\" (UID: \"40ad543f-d10f-4d38-bf97-fc1fd668e30f\") " pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.128798 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0d4003-2819-4304-ad4f-0815dd53db79-config\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.132168 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea0d4003-2819-4304-ad4f-0815dd53db79-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-brxkd\" (UID: \"ea0d4003-2819-4304-ad4f-0815dd53db79\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.133440 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.133710 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.134180 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/affbff56-7e60-4b68-9efa-eb610da84f54-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.134493 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-tls\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.147253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx76n\" (UniqueName: \"kubernetes.io/projected/cfc908de-22b4-4ca7-a60e-1ed8592f563e-kube-api-access-wx76n\") pod \"apiserver-7bbb656c7d-cmp9v\" (UID: \"cfc908de-22b4-4ca7-a60e-1ed8592f563e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.168854 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05387b29-effd-4fb4-9bd4-21acd9989f76-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z2pbk\" (UID: \"05387b29-effd-4fb4-9bd4-21acd9989f76\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.189561 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d986p\" (UniqueName: \"kubernetes.io/projected/23b57c5c-dc24-40cd-8563-9e43ddc844dc-kube-api-access-d986p\") pod \"openshift-config-operator-7777fb866f-9zfdj\" (UID: \"23b57c5c-dc24-40cd-8563-9e43ddc844dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191646 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvcbf\" (UniqueName: \"kubernetes.io/projected/cd133d9f-4963-42d4-bd73-f09c510629cc-kube-api-access-xvcbf\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2nc6\" (UniqueName: \"kubernetes.io/projected/5e72c156-e23c-414e-a88d-5cef5965f47e-kube-api-access-w2nc6\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191739 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-csi-data-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191774 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-profile-collector-cert\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191799 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-apiservice-cert\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191847 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ec8abb87-07ac-4227-a787-6db444002c0c-signing-key\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191878 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwvmn\" (UniqueName: \"kubernetes.io/projected/5a24054f-e850-4d6b-b15e-e37115316230-kube-api-access-vwvmn\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191901 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhhnk\" (UniqueName: 
\"kubernetes.io/projected/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-kube-api-access-dhhnk\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191923 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5e72c156-e23c-414e-a88d-5cef5965f47e-node-bootstrap-token\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191946 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cd133d9f-4963-42d4-bd73-f09c510629cc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191973 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjk9w\" (UniqueName: \"kubernetes.io/projected/5d1908a5-f826-4eae-a6fa-c899dda28b57-kube-api-access-qjk9w\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.191997 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49bf7d38-dc2d-4f9c-9050-202bb1e40747-proxy-tls\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc 
kubenswrapper[4725]: I0227 06:13:56.192074 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5e72c156-e23c-414e-a88d-5cef5965f47e-certs\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192097 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192129 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b4a7f10-c643-4e7c-b4c9-76bef06ce76d-cert\") pod \"ingress-canary-ftrcl\" (UID: \"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d\") " pod="openshift-ingress-canary/ingress-canary-ftrcl" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192175 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-proxy-tls\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192200 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8j7\" (UniqueName: \"kubernetes.io/projected/7b4a7f10-c643-4e7c-b4c9-76bef06ce76d-kube-api-access-6b8j7\") pod \"ingress-canary-ftrcl\" (UID: \"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d\") " 
pod="openshift-ingress-canary/ingress-canary-ftrcl" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192265 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdbr\" (UniqueName: \"kubernetes.io/projected/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-kube-api-access-pfdbr\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192354 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a24054f-e850-4d6b-b15e-e37115316230-serving-cert\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192379 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a24054f-e850-4d6b-b15e-e37115316230-config\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192401 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192425 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pqv\" (UniqueName: \"kubernetes.io/projected/a7826377-b3f7-47a0-8967-04fa1243de5f-kube-api-access-52pqv\") pod 
\"multus-admission-controller-857f4d67dd-9v749\" (UID: \"a7826377-b3f7-47a0-8967-04fa1243de5f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192449 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l5hm\" (UniqueName: \"kubernetes.io/projected/6b2da58a-3e24-4e72-a25d-eeee730910cd-kube-api-access-4l5hm\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192471 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-metrics-tls\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192496 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgm7z\" (UniqueName: \"kubernetes.io/projected/49bf7d38-dc2d-4f9c-9050-202bb1e40747-kube-api-access-tgm7z\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192531 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2v98\" (UniqueName: \"kubernetes.io/projected/234512e0-3471-4bd8-b783-6df7b63f2cfe-kube-api-access-s2v98\") pod \"auto-csr-approver-29536212-627nh\" (UID: \"234512e0-3471-4bd8-b783-6df7b63f2cfe\") " pod="openshift-infra/auto-csr-approver-29536212-627nh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192555 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192578 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-mountpoint-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192607 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlc6k\" (UniqueName: \"kubernetes.io/projected/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-kube-api-access-qlc6k\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192633 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-tmpfs\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192656 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-registration-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc 
kubenswrapper[4725]: I0227 06:13:56.192677 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-config-volume\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192699 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4zz\" (UniqueName: \"kubernetes.io/projected/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-kube-api-access-7c4zz\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192725 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snc2r\" (UniqueName: \"kubernetes.io/projected/ec8abb87-07ac-4227-a787-6db444002c0c-kube-api-access-snc2r\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192749 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5btl\" (UniqueName: \"kubernetes.io/projected/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-kube-api-access-q5btl\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192776 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-webhook-cert\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192798 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ec8abb87-07ac-4227-a787-6db444002c0c-signing-cabundle\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192820 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5rm\" (UniqueName: \"kubernetes.io/projected/04f492bb-9555-42d8-9e3a-1e419c3d7607-kube-api-access-kq5rm\") pod \"package-server-manager-789f6589d5-497vv\" (UID: \"04f492bb-9555-42d8-9e3a-1e419c3d7607\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192846 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf5jk\" (UniqueName: \"kubernetes.io/projected/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-kube-api-access-tf5jk\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192867 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cd133d9f-4963-42d4-bd73-f09c510629cc-srv-cert\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-srv-cert\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192901 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192918 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d1908a5-f826-4eae-a6fa-c899dda28b57-secret-volume\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192949 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192966 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49bf7d38-dc2d-4f9c-9050-202bb1e40747-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.192987 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193011 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7826377-b3f7-47a0-8967-04fa1243de5f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9v749\" (UID: \"a7826377-b3f7-47a0-8967-04fa1243de5f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193026 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d1908a5-f826-4eae-a6fa-c899dda28b57-config-volume\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193043 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-socket-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193062 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/04f492bb-9555-42d8-9e3a-1e419c3d7607-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-497vv\" (UID: \"04f492bb-9555-42d8-9e3a-1e419c3d7607\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193066 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-csi-data-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193078 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-plugins-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.193130 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/49bf7d38-dc2d-4f9c-9050-202bb1e40747-images\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.194961 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/49bf7d38-dc2d-4f9c-9050-202bb1e40747-images\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.195077 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-plugins-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.195446 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a24054f-e850-4d6b-b15e-e37115316230-config\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.195747 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-tmpfs\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.196640 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-mountpoint-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.200125 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ec8abb87-07ac-4227-a787-6db444002c0c-signing-key\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.200789 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.200874 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-socket-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.201263 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-registration-dir\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.201526 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.70150869 +0000 UTC m=+215.164129259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.203749 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49bf7d38-dc2d-4f9c-9050-202bb1e40747-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.204732 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.205127 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.205150 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7826377-b3f7-47a0-8967-04fa1243de5f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9v749\" (UID: \"a7826377-b3f7-47a0-8967-04fa1243de5f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.205376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d1908a5-f826-4eae-a6fa-c899dda28b57-secret-volume\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.205725 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ec8abb87-07ac-4227-a787-6db444002c0c-signing-cabundle\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.206063 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.206067 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-metrics-tls\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.206360 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-config-volume\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.206634 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-proxy-tls\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.206883 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a24054f-e850-4d6b-b15e-e37115316230-serving-cert\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.207817 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b4a7f10-c643-4e7c-b4c9-76bef06ce76d-cert\") pod \"ingress-canary-ftrcl\" (UID: \"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d\") " pod="openshift-ingress-canary/ingress-canary-ftrcl"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.207865 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d1908a5-f826-4eae-a6fa-c899dda28b57-config-volume\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.207974 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-profile-collector-cert\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.208643 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-srv-cert\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.209142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49bf7d38-dc2d-4f9c-9050-202bb1e40747-proxy-tls\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.209844 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-apiservice-cert\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.211153 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-webhook-cert\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.211579 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cd133d9f-4963-42d4-bd73-f09c510629cc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.211817 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkm57\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-kube-api-access-kkm57\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.211835 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f492bb-9555-42d8-9e3a-1e419c3d7607-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-497vv\" (UID: \"04f492bb-9555-42d8-9e3a-1e419c3d7607\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.211873 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cd133d9f-4963-42d4-bd73-f09c510629cc-srv-cert\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.212038 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5e72c156-e23c-414e-a88d-5cef5965f47e-node-bootstrap-token\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.212577 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5e72c156-e23c-414e-a88d-5cef5965f47e-certs\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.212690 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.236153 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mrg7\" (UniqueName: \"kubernetes.io/projected/2aec175b-6e2b-4eac-a94f-771881386ffc-kube-api-access-7mrg7\") pod \"control-plane-machine-set-operator-78cbb6b69f-x796f\" (UID: \"2aec175b-6e2b-4eac-a94f-771881386ffc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.249538 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-bound-sa-token\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.273402 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.292233 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.294334 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.294537 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.794495636 +0000 UTC m=+215.257116205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.294721 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.294962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnnwp\" (UniqueName: \"kubernetes.io/projected/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-kube-api-access-bnnwp\") pod \"route-controller-manager-6576b87f9c-lv6kn\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.295388 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.795379591 +0000 UTC m=+215.258000160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.302159 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/affbff56-7e60-4b68-9efa-eb610da84f54-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q4cxd\" (UID: \"affbff56-7e60-4b68-9efa-eb610da84f54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.304374 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn747"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.312571 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf9pq\" (UniqueName: \"kubernetes.io/projected/6931a0dd-a354-441f-ae2c-3d6c3e59777e-kube-api-access-cf9pq\") pod \"etcd-operator-b45778765-mc28q\" (UID: \"6931a0dd-a354-441f-ae2c-3d6c3e59777e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.355461 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8j7\" (UniqueName: \"kubernetes.io/projected/7b4a7f10-c643-4e7c-b4c9-76bef06ce76d-kube-api-access-6b8j7\") pod \"ingress-canary-ftrcl\" (UID: \"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d\") " pod="openshift-ingress-canary/ingress-canary-ftrcl"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.361444 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.363319 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-28snx"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.364672 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.370766 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvcbf\" (UniqueName: \"kubernetes.io/projected/cd133d9f-4963-42d4-bd73-f09c510629cc-kube-api-access-xvcbf\") pod \"olm-operator-6b444d44fb-rlqln\" (UID: \"cd133d9f-4963-42d4-bd73-f09c510629cc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.372470 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.382737 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.386759 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zbhxj"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.387313 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2nc6\" (UniqueName: \"kubernetes.io/projected/5e72c156-e23c-414e-a88d-5cef5965f47e-kube-api-access-w2nc6\") pod \"machine-config-server-p5m2x\" (UID: \"5e72c156-e23c-414e-a88d-5cef5965f47e\") " pod="openshift-machine-config-operator/machine-config-server-p5m2x"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.390655 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.395846 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r5pzq"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.395922 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.396389 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.896371309 +0000 UTC m=+215.358991878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.408142 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.419003 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjk9w\" (UniqueName: \"kubernetes.io/projected/5d1908a5-f826-4eae-a6fa-c899dda28b57-kube-api-access-qjk9w\") pod \"collect-profiles-29536200-5ttzp\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.425621 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5"]
Feb 27 06:13:56 crc kubenswrapper[4725]: W0227 06:13:56.432205 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aa28370_7d59_4c1c_aac9_582c6ca3da2f.slice/crio-67ca8a6373b09dbf9ffd8617069f8233a5e3e268c2e2068227c75cee464c386f WatchSource:0}: Error finding container 67ca8a6373b09dbf9ffd8617069f8233a5e3e268c2e2068227c75cee464c386f: Status 404 returned error can't find the container with id 67ca8a6373b09dbf9ffd8617069f8233a5e3e268c2e2068227c75cee464c386f
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.436145 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-89pl9"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.439163 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snc2r\" (UniqueName: \"kubernetes.io/projected/ec8abb87-07ac-4227-a787-6db444002c0c-kube-api-access-snc2r\") pod \"service-ca-9c57cc56f-g7kxx\" (UID: \"ec8abb87-07ac-4227-a787-6db444002c0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx"
Feb 27 06:13:56 crc kubenswrapper[4725]: W0227 06:13:56.441550 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109c9025_3aff_4dc5_a351_c6686c336671.slice/crio-8645afcd697a1b851a54921a4f0135b509a27cc40c79aeac2133cc8cd1da6458 WatchSource:0}: Error finding container 8645afcd697a1b851a54921a4f0135b509a27cc40c79aeac2133cc8cd1da6458: Status 404 returned error can't find the container with id 8645afcd697a1b851a54921a4f0135b509a27cc40c79aeac2133cc8cd1da6458
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.449969 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.451088 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ndprs"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.460765 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.461807 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdbr\" (UniqueName: \"kubernetes.io/projected/cc7b9cb5-ba37-4857-a03f-c63a32ad6c07-kube-api-access-pfdbr\") pod \"dns-default-pmdnc\" (UID: \"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07\") " pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.478810 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.480498 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2v98\" (UniqueName: \"kubernetes.io/projected/234512e0-3471-4bd8-b783-6df7b63f2cfe-kube-api-access-s2v98\") pod \"auto-csr-approver-29536212-627nh\" (UID: \"234512e0-3471-4bd8-b783-6df7b63f2cfe\") " pod="openshift-infra/auto-csr-approver-29536212-627nh"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.493557 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l5hm\" (UniqueName: \"kubernetes.io/projected/6b2da58a-3e24-4e72-a25d-eeee730910cd-kube-api-access-4l5hm\") pod \"marketplace-operator-79b997595-zmwz5\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.497370 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.497793 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:56.997781029 +0000 UTC m=+215.460401598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.510351 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pqv\" (UniqueName: \"kubernetes.io/projected/a7826377-b3f7-47a0-8967-04fa1243de5f-kube-api-access-52pqv\") pod \"multus-admission-controller-857f4d67dd-9v749\" (UID: \"a7826377-b3f7-47a0-8967-04fa1243de5f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.510568 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.513850 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.514534 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.517418 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qr9hb"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.535217 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536212-627nh"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.537574 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlc6k\" (UniqueName: \"kubernetes.io/projected/c62ca9d3-ccb1-486e-9371-9f2b71893ec7-kube-api-access-qlc6k\") pod \"kube-storage-version-migrator-operator-b67b599dd-vg2qp\" (UID: \"c62ca9d3-ccb1-486e-9371-9f2b71893ec7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.548132 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgm7z\" (UniqueName: \"kubernetes.io/projected/49bf7d38-dc2d-4f9c-9050-202bb1e40747-kube-api-access-tgm7z\") pod \"machine-config-operator-74547568cd-jmnj2\" (UID: \"49bf7d38-dc2d-4f9c-9050-202bb1e40747\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.569489 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf5jk\" (UniqueName: \"kubernetes.io/projected/2bc010a9-3840-4eaf-922c-9f67e82cb6bf-kube-api-access-tf5jk\") pod \"packageserver-d55dfcdfc-n8wkn\" (UID: \"2bc010a9-3840-4eaf-922c-9f67e82cb6bf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.571086 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pmdnc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.579648 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v"]
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.579777 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.582857 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ftrcl"
Feb 27 06:13:56 crc kubenswrapper[4725]: W0227 06:13:56.584567 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0dcb291_e867_45d5_91f3_fa9b18a090c5.slice/crio-ff74a9c5f338fdc90d580f22fa8610a9dfe2f974661a3cf4fa9cdc50f68abcbe WatchSource:0}: Error finding container ff74a9c5f338fdc90d580f22fa8610a9dfe2f974661a3cf4fa9cdc50f68abcbe: Status 404 returned error can't find the container with id ff74a9c5f338fdc90d580f22fa8610a9dfe2f974661a3cf4fa9cdc50f68abcbe
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.587497 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5rm\" (UniqueName: \"kubernetes.io/projected/04f492bb-9555-42d8-9e3a-1e419c3d7607-kube-api-access-kq5rm\") pod \"package-server-manager-789f6589d5-497vv\" (UID: \"04f492bb-9555-42d8-9e3a-1e419c3d7607\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.589027 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-p5m2x"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.598876 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.599255 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.099195939 +0000 UTC m=+215.561816508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.599751 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.600434 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.100425593 +0000 UTC m=+215.563046162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.620475 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5btl\" (UniqueName: \"kubernetes.io/projected/1aadc96d-d9dd-4848-b4d1-6dc7579002f6-kube-api-access-q5btl\") pod \"csi-hostpathplugin-64j8f\" (UID: \"1aadc96d-d9dd-4848-b4d1-6dc7579002f6\") " pod="hostpath-provisioner/csi-hostpathplugin-64j8f"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.634049 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" event={"ID":"96ae065a-f626-466b-a56e-089250cf7405","Type":"ContainerStarted","Data":"4f3b93ac7c55b91b1b06930d0f9931ee94d172b7d07de26e7f559f2ee412d7a3"}
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.636257 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" event={"ID":"9904efe3-73fc-4efd-ac2c-39a0653315ba","Type":"ContainerStarted","Data":"2ebbfd941f90c85993a7f3317c93012411a8779f93bd4f3212b091a0b887ab0e"}
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.636278 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" event={"ID":"9904efe3-73fc-4efd-ac2c-39a0653315ba","Type":"ContainerStarted","Data":"5847d7d8d9127421a31848fec985345a8d946c3500da3c31256742a54c85c54a"}
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.640631 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4zz\" (UniqueName: \"kubernetes.io/projected/cdf64ed0-6837-4ef7-b2c5-2e5880e1385b-kube-api-access-7c4zz\") pod \"machine-config-controller-84d6567774-fgnvc\" (UID: \"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.644665 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-28snx" event={"ID":"40ad543f-d10f-4d38-bf97-fc1fd668e30f","Type":"ContainerStarted","Data":"2a7e33193cff4fe21ce151e34229d58dca95e95c9f5ca7092572399b4bad8e75"}
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.646833 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" event={"ID":"fb380083-e346-435c-8290-ab59f1a1d190","Type":"ContainerStarted","Data":"bc7ed8617657ee80eca279f00d35a4a46ff3a3c2cdb19068bcabd15c7dfd517a"}
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.648473 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" event={"ID":"1002e916-f907-4ae4-bc0f-3a08f4b70f5f","Type":"ContainerStarted","Data":"f1486d1141c78496ead3217ed3ca0b4f3dff0aeb0aa2623445a980a04a9aea6b"}
Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.650923 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r5pzq"
event={"ID":"5aa28370-7d59-4c1c-aac9-582c6ca3da2f","Type":"ContainerStarted","Data":"67ca8a6373b09dbf9ffd8617069f8233a5e3e268c2e2068227c75cee464c386f"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.654857 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhhnk\" (UniqueName: \"kubernetes.io/projected/8aa612f1-e5b4-4f52-93d9-c52cedd7740e-kube-api-access-dhhnk\") pod \"catalog-operator-68c6474976-7tszh\" (UID: \"8aa612f1-e5b4-4f52-93d9-c52cedd7740e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.655827 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" event={"ID":"89db0909-19ba-4b2f-adaf-9bc42c9efd1d","Type":"ContainerStarted","Data":"c3c7cb8b86ecc7537b4eaa1076e01dc8e2dce25548a5877d54d4ef027d162fc4"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.656095 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" event={"ID":"89db0909-19ba-4b2f-adaf-9bc42c9efd1d","Type":"ContainerStarted","Data":"dfac07a2e4d602f850287756f66d0573132c9b50b561229088155f51be309d0c"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.657096 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b58nc" event={"ID":"6e178501-6e8d-4fcd-abf4-b13cd287501b","Type":"ContainerStarted","Data":"b291f97962b8f2f35d8b3e95501c2c0f65d8ff29f00f2ec266eafe73628701c7"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.657153 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b58nc" event={"ID":"6e178501-6e8d-4fcd-abf4-b13cd287501b","Type":"ContainerStarted","Data":"0346b96c2cc2fa352fd8c08612abb55b132ca48da13af391c1a0dadada1e22d3"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.657376 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.657752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" event={"ID":"109c9025-3aff-4dc5-a351-c6686c336671","Type":"ContainerStarted","Data":"8645afcd697a1b851a54921a4f0135b509a27cc40c79aeac2133cc8cd1da6458"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.672362 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-b58nc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.672411 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b58nc" podUID="6e178501-6e8d-4fcd-abf4-b13cd287501b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.673883 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" event={"ID":"eda518bc-5412-413a-8958-ea97c24a9795","Type":"ContainerStarted","Data":"4ba6fd90af4e33eb5bb36918010e39461cd9388d94a056b03542882de1cf251f"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.675350 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" event={"ID":"e0a18ecb-f59a-412e-b224-0bdbd115bd90","Type":"ContainerStarted","Data":"2df17f0d3d61a2a2c63d302472136cf3eb27da9b4982a9218bcc239304a501fa"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.676964 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" event={"ID":"aeead09f-faab-4a7f-8aed-7a7823d350bc","Type":"ContainerStarted","Data":"a29b38a32557c6149f42725a49f4a7ce00cd183f52b8d801bdbe5100e6fde4c5"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.676992 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" event={"ID":"aeead09f-faab-4a7f-8aed-7a7823d350bc","Type":"ContainerStarted","Data":"f70991308c3de9e489864c86b8261a1cb28b23f5b65d616618a5e1897835e575"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.683793 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qr9hb" event={"ID":"b0dcb291-e867-45d5-91f3-fa9b18a090c5","Type":"ContainerStarted","Data":"ff74a9c5f338fdc90d580f22fa8610a9dfe2f974661a3cf4fa9cdc50f68abcbe"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.684704 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" event={"ID":"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780","Type":"ContainerStarted","Data":"a62ecf522dc5def01647f52e9a50357ef7076d7f60536bb210302e95a063367c"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.685851 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" event={"ID":"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d","Type":"ContainerStarted","Data":"aba09c7075131eb01bac366f03b8c1b1730d6a18dcd35a02b646e91abda55b76"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.688065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" event={"ID":"711e69a7-689f-47d6-840e-90ca3779ce5a","Type":"ContainerStarted","Data":"d12aba1cac11945537cf2682574ac3052772743132920086992f135bd5d880e6"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.688113 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" event={"ID":"711e69a7-689f-47d6-840e-90ca3779ce5a","Type":"ContainerStarted","Data":"8e6f210c9d74a00d394e8165fbd165c31d7675d1ba927d9de2585934a088c95e"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.694506 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwvmn\" (UniqueName: \"kubernetes.io/projected/5a24054f-e850-4d6b-b15e-e37115316230-kube-api-access-vwvmn\") pod \"service-ca-operator-777779d784-6lpfn\" (UID: \"5a24054f-e850-4d6b-b15e-e37115316230\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:56 crc kubenswrapper[4725]: W0227 06:13:56.702801 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfc908de_22b4_4ca7_a60e_1ed8592f563e.slice/crio-a4ab77266b4f1944382822cdcfd485d3e8d24ec378fe3137bb4fd1b9652bbfdb WatchSource:0}: Error finding container a4ab77266b4f1944382822cdcfd485d3e8d24ec378fe3137bb4fd1b9652bbfdb: Status 404 returned error can't find the container with id a4ab77266b4f1944382822cdcfd485d3e8d24ec378fe3137bb4fd1b9652bbfdb Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.703886 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.706445 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 06:13:57.206364348 +0000 UTC m=+215.668984917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.703352 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" event={"ID":"682c856f-0661-4039-b071-e5c75267f3f1","Type":"ContainerStarted","Data":"d65de6068935dfb3e4cf9e18d25d20f7bc82dd0228e6575d9d715c152a904ba6"} Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.716814 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.725898 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.743623 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.752556 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd"] Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.756932 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.773005 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.777832 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.784096 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk"] Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.787624 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.798831 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.809364 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.811106 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 06:13:57.311069369 +0000 UTC m=+215.773690038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.843831 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.853719 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.857675 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj"] Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.904836 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd"] Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.911614 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.911767 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.411748849 +0000 UTC m=+215.874369418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.911988 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:56 crc kubenswrapper[4725]: E0227 06:13:56.912306 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.412299204 +0000 UTC m=+215.874919773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:56 crc kubenswrapper[4725]: W0227 06:13:56.938597 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0d4003_2819_4304_ad4f_0815dd53db79.slice/crio-e23114e9a0fef9019c409cda56aeb9fadbcfab81fbb00ae54bb3d0a43ec17397 WatchSource:0}: Error finding container e23114e9a0fef9019c409cda56aeb9fadbcfab81fbb00ae54bb3d0a43ec17397: Status 404 returned error can't find the container with id e23114e9a0fef9019c409cda56aeb9fadbcfab81fbb00ae54bb3d0a43ec17397 Feb 27 06:13:56 crc kubenswrapper[4725]: I0227 06:13:56.971317 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.000491 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.014400 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.015399 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.51537943 +0000 UTC m=+215.977999999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: W0227 06:13:57.048538 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b57c5c_dc24_40cd_8563_9e43ddc844dc.slice/crio-43415d23fbbac14caded611df61123d5787ab37c7478b9e871f59fcce85c804c WatchSource:0}: Error finding container 43415d23fbbac14caded611df61123d5787ab37c7478b9e871f59fcce85c804c: Status 404 returned error can't find the container with id 43415d23fbbac14caded611df61123d5787ab37c7478b9e871f59fcce85c804c Feb 27 06:13:57 crc kubenswrapper[4725]: W0227 06:13:57.083072 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdfdd960_ccbf_4554_b737_aa1c1f1e6572.slice/crio-0603ee1a84d64793a72106d3cbc363cb44c27597d29e31e4363cf613045a534a WatchSource:0}: Error finding container 0603ee1a84d64793a72106d3cbc363cb44c27597d29e31e4363cf613045a534a: Status 404 returned error can't find the container with id 0603ee1a84d64793a72106d3cbc363cb44c27597d29e31e4363cf613045a534a Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.117532 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.117832 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.617816398 +0000 UTC m=+216.080436967 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.147168 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mc28q"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.218057 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.218592 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.218637 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.218656 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.221837 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.72180931 +0000 UTC m=+216.184429879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.222661 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.248806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.251846 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.276365 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:13:57 crc kubenswrapper[4725]: W0227 06:13:57.276529 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e72c156_e23c_414e_a88d_5cef5965f47e.slice/crio-ec61d419782e7edb64a559744de657c0bf8df55eee062790c84d2c2978a5967c WatchSource:0}: Error finding container ec61d419782e7edb64a559744de657c0bf8df55eee062790c84d2c2978a5967c: Status 404 returned error can't find the container with id ec61d419782e7edb64a559744de657c0bf8df55eee062790c84d2c2978a5967c Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.321931 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.321986 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.323456 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.823438485 +0000 UTC m=+216.286059114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.333778 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"] Feb 27 06:13:57 crc kubenswrapper[4725]: W0227 06:13:57.336338 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6931a0dd_a354_441f_ae2c_3d6c3e59777e.slice/crio-20b32157078a19b8d00edcf95013faaf618798ddbd2ca6f817fd943622f09fc5 WatchSource:0}: Error finding container 20b32157078a19b8d00edcf95013faaf618798ddbd2ca6f817fd943622f09fc5: Status 404 returned error can't find the container with id 20b32157078a19b8d00edcf95013faaf618798ddbd2ca6f817fd943622f09fc5 Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.340386 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.345778 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pmdnc"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.348180 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln"] Feb 27 06:13:57 crc 
kubenswrapper[4725]: I0227 06:13:57.383515 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-g7kxx"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.422623 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.422947 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:57.922933142 +0000 UTC m=+216.385553711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.475994 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.489080 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.515949 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" podStartSLOduration=153.515930739 podStartE2EDuration="2m33.515930739s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:57.512276498 +0000 UTC m=+215.974897067" watchObservedRunningTime="2026-02-27 06:13:57.515930739 +0000 UTC m=+215.978551308" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.526221 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.526608 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.026590724 +0000 UTC m=+216.489211293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.576925 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.629379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.629563 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.129534387 +0000 UTC m=+216.592154956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.630346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.635237 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.134815283 +0000 UTC m=+216.597435962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.701296 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.734115 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.734401 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.234376842 +0000 UTC m=+216.696997411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.734632 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.735029 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.23502218 +0000 UTC m=+216.697642749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.766570 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ftrcl"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.767509 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536212-627nh"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.805263 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9v749"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.833278 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-b58nc" podStartSLOduration=152.833258661 podStartE2EDuration="2m32.833258661s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:57.827842561 +0000 UTC m=+216.290463150" watchObservedRunningTime="2026-02-27 06:13:57.833258661 +0000 UTC m=+216.295879230" Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.840035 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:57 crc 
kubenswrapper[4725]: E0227 06:13:57.840481 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.340463811 +0000 UTC m=+216.803084370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.885985 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.941229 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:57 crc kubenswrapper[4725]: E0227 06:13:57.941591 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.441578603 +0000 UTC m=+216.904199172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:57 crc kubenswrapper[4725]: W0227 06:13:57.942347 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf64ed0_6837_4ef7_b2c5_2e5880e1385b.slice/crio-c92209934ea2c3b5406405ae0a91c5e3433f10c8c82f65b839cb82b95159f5c7 WatchSource:0}: Error finding container c92209934ea2c3b5406405ae0a91c5e3433f10c8c82f65b839cb82b95159f5c7: Status 404 returned error can't find the container with id c92209934ea2c3b5406405ae0a91c5e3433f10c8c82f65b839cb82b95159f5c7 Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.956041 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmwz5"] Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.966876 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" event={"ID":"682c856f-0661-4039-b071-e5c75267f3f1","Type":"ContainerStarted","Data":"1b575e981d2e10c0689990b4e43a1432b44fa9002fb14f5a2381191827b5326b"} Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.981185 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" event={"ID":"fb380083-e346-435c-8290-ab59f1a1d190","Type":"ContainerStarted","Data":"640777beb7725c1fed078da879ea55e78ab955e7ce878fab7e8f08c1bf522797"} Feb 27 06:13:57 crc kubenswrapper[4725]: W0227 06:13:57.988397 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod234512e0_3471_4bd8_b783_6df7b63f2cfe.slice/crio-c968e56d6f45f123d0e6f236b3cbbc2384fe6d4f94c88aed5a46791a9d4bfcc8 WatchSource:0}: Error finding container c968e56d6f45f123d0e6f236b3cbbc2384fe6d4f94c88aed5a46791a9d4bfcc8: Status 404 returned error can't find the container with id c968e56d6f45f123d0e6f236b3cbbc2384fe6d4f94c88aed5a46791a9d4bfcc8 Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.990968 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" event={"ID":"89db0909-19ba-4b2f-adaf-9bc42c9efd1d","Type":"ContainerStarted","Data":"fef71eca9a2e6c249f1d68cc08b47a7b68c3b58732f52ecb68d43f3f4268d8b1"} Feb 27 06:13:57 crc kubenswrapper[4725]: I0227 06:13:57.991014 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.005372 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" event={"ID":"05387b29-effd-4fb4-9bd4-21acd9989f76","Type":"ContainerStarted","Data":"fab89c768fa83cd20cd88937a26384bde220696089612dd61f0ec39e90d20100"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.041278 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64j8f"] Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.042031 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.046993 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.546968593 +0000 UTC m=+217.009589162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.085780 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" event={"ID":"ea0d4003-2819-4304-ad4f-0815dd53db79","Type":"ContainerStarted","Data":"e23114e9a0fef9019c409cda56aeb9fadbcfab81fbb00ae54bb3d0a43ec17397"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.094217 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" event={"ID":"5aa28370-7d59-4c1c-aac9-582c6ca3da2f","Type":"ContainerStarted","Data":"d4ec7ff119371860b6d8c46676c1aa707559ec67ed38f07cdb75a606dac14e8b"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.094833 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.102747 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" event={"ID":"96ae065a-f626-466b-a56e-089250cf7405","Type":"ContainerStarted","Data":"8223340b6f7b94cb611269e733942c4e417c810d395266efc73efda805815766"} Feb 27 
06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.105646 4725 patch_prober.go:28] interesting pod/console-operator-58897d9998-r5pzq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.105696 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" podUID="5aa28370-7d59-4c1c-aac9-582c6ca3da2f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.128089 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn"] Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.142432 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" event={"ID":"eda518bc-5412-413a-8958-ea97c24a9795","Type":"ContainerStarted","Data":"f55ec42995860ce348d166b7e2a2e0abcb0039daeaa8b22c469cbe5b04c58e80"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.142951 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.143596 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.143847 4725 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.643835697 +0000 UTC m=+217.106456266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.150531 4725 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bn747 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.150632 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" podUID="eda518bc-5412-413a-8958-ea97c24a9795" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.155696 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" event={"ID":"23b57c5c-dc24-40cd-8563-9e43ddc844dc","Type":"ContainerStarted","Data":"43415d23fbbac14caded611df61123d5787ab37c7478b9e871f59fcce85c804c"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.164740 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" event={"ID":"9904efe3-73fc-4efd-ac2c-39a0653315ba","Type":"ContainerStarted","Data":"a3261de48c514e4de37f8d141a2cb2b2cee9fe129210da6a1e81547705d9395d"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.166005 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" event={"ID":"cdfdd960-ccbf-4554-b737-aa1c1f1e6572","Type":"ContainerStarted","Data":"0603ee1a84d64793a72106d3cbc363cb44c27597d29e31e4363cf613045a534a"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.174826 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pmdnc" event={"ID":"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07","Type":"ContainerStarted","Data":"9b38252e34a204f98bea93f023b8667efdc5ea94fe07d201c85f9a4e11b57cac"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.206561 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qr9hb" event={"ID":"b0dcb291-e867-45d5-91f3-fa9b18a090c5","Type":"ContainerStarted","Data":"413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.210859 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" event={"ID":"5d1908a5-f826-4eae-a6fa-c899dda28b57","Type":"ContainerStarted","Data":"6a71e22eb325708ee42279eec4cbbad7ffc73096e7d1b79b104749db7b4168d7"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.224189 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" event={"ID":"affbff56-7e60-4b68-9efa-eb610da84f54","Type":"ContainerStarted","Data":"525ee3a31219db95bb4b285dccbbc5389ecde6112bffb9fb1b691b56f74e2da1"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.234258 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mvds5" podStartSLOduration=154.234236572 podStartE2EDuration="2m34.234236572s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.230671753 +0000 UTC m=+216.693292332" watchObservedRunningTime="2026-02-27 06:13:58.234236572 +0000 UTC m=+216.696857141" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.234678 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-l686v" podStartSLOduration=154.234673594 podStartE2EDuration="2m34.234673594s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.188913466 +0000 UTC m=+216.651534055" watchObservedRunningTime="2026-02-27 06:13:58.234673594 +0000 UTC m=+216.697294163" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.245360 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.245647 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.745612907 +0000 UTC m=+217.208233466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.246067 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.250492 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.750448771 +0000 UTC m=+217.213069530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.305399 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zclkw" podStartSLOduration=154.305377003 podStartE2EDuration="2m34.305377003s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.301910537 +0000 UTC m=+216.764531116" watchObservedRunningTime="2026-02-27 06:13:58.305377003 +0000 UTC m=+216.767997572" Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.336207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" event={"ID":"109c9025-3aff-4dc5-a351-c6686c336671","Type":"ContainerStarted","Data":"0e70f639ff1aacc14f533a3807596ae90113f91015d917a0befb05c60bb52140"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.336253 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" event={"ID":"6931a0dd-a354-441f-ae2c-3d6c3e59777e","Type":"ContainerStarted","Data":"20b32157078a19b8d00edcf95013faaf618798ddbd2ca6f817fd943622f09fc5"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.336644 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-28snx" 
event={"ID":"40ad543f-d10f-4d38-bf97-fc1fd668e30f","Type":"ContainerStarted","Data":"2129b6b8ef91c23707ce0883be1899823ca8aaf9adca80d61d4ec2171ddc8e2d"} Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.347223 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.347638 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.847614473 +0000 UTC m=+217.310235042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.348357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.351397 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.851382708 +0000 UTC m=+217.314003287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.357662 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" podStartSLOduration=154.357631991 podStartE2EDuration="2m34.357631991s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.351812459 +0000 UTC m=+216.814433028" watchObservedRunningTime="2026-02-27 06:13:58.357631991 +0000 UTC m=+216.820252560"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.359467 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p5m2x" event={"ID":"5e72c156-e23c-414e-a88d-5cef5965f47e","Type":"ContainerStarted","Data":"ec61d419782e7edb64a559744de657c0bf8df55eee062790c84d2c2978a5967c"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.368616 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-28snx"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.373557 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.373676 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.377980 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" event={"ID":"cfc908de-22b4-4ca7-a60e-1ed8592f563e","Type":"ContainerStarted","Data":"a4ab77266b4f1944382822cdcfd485d3e8d24ec378fe3137bb4fd1b9652bbfdb"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.390900 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" event={"ID":"3c16e75d-2063-4bb9-ba56-a7ac4f9fa24d","Type":"ContainerStarted","Data":"b9285520c54068c7b1f42f5f92f98a11f102d59ad5914fb9f9e13e375416bf58"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.395454 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" event={"ID":"2aec175b-6e2b-4eac-a94f-771881386ffc","Type":"ContainerStarted","Data":"dc26fc48d3760f894878906227efe011dd76abffed18758975929493ab4ced45"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.397246 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgb8c" podStartSLOduration=154.397197976 podStartE2EDuration="2m34.397197976s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.38509415 +0000 UTC m=+216.847714719" watchObservedRunningTime="2026-02-27 06:13:58.397197976 +0000 UTC m=+216.859818545"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.417399 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2"]
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.421754 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv"]
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.422342 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" event={"ID":"e0a18ecb-f59a-412e-b224-0bdbd115bd90","Type":"ContainerStarted","Data":"02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.424215 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.430177 4725 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zbhxj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body=
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.430217 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.436960 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"]
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.442101 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" podStartSLOduration=153.442078699 podStartE2EDuration="2m33.442078699s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.42981424 +0000 UTC m=+216.892434809" watchObservedRunningTime="2026-02-27 06:13:58.442078699 +0000 UTC m=+216.904699278"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.449751 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.450124 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:58.950105602 +0000 UTC m=+217.412726171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.450427 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" event={"ID":"cd133d9f-4963-42d4-bd73-f09c510629cc","Type":"ContainerStarted","Data":"d4d17a21c05a6825775a3e13172eadb66c17d1050560a0d3fc852f509fdba9e9"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.485270 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" event={"ID":"1002e916-f907-4ae4-bc0f-3a08f4b70f5f","Type":"ContainerStarted","Data":"11181c0d1d5959aa51550230ed357699309c773440ec2ca1071e78db4b6167c0"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.498454 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" event={"ID":"ec8abb87-07ac-4227-a787-6db444002c0c","Type":"ContainerStarted","Data":"0920287885a39c51e68a0267cc309542df290676ce8f80e644f8c5db582e3065"}
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.500608 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-b58nc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.500682 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b58nc" podUID="6e178501-6e8d-4fcd-abf4-b13cd287501b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.522444 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7cjxg" podStartSLOduration=153.522423476 podStartE2EDuration="2m33.522423476s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.476978477 +0000 UTC m=+216.939599066" watchObservedRunningTime="2026-02-27 06:13:58.522423476 +0000 UTC m=+216.985044045"
Feb 27 06:13:58 crc kubenswrapper[4725]: W0227 06:13:58.526577 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-07b17eae706984c4cdbb1ee35cd15227e7d3ede82f52c74b7372ca2240ae2844 WatchSource:0}: Error finding container 07b17eae706984c4cdbb1ee35cd15227e7d3ede82f52c74b7372ca2240ae2844: Status 404 returned error can't find the container with id 07b17eae706984c4cdbb1ee35cd15227e7d3ede82f52c74b7372ca2240ae2844
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.531021 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" podStartSLOduration=154.531000273 podStartE2EDuration="2m34.531000273s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.519389262 +0000 UTC m=+216.982009831" watchObservedRunningTime="2026-02-27 06:13:58.531000273 +0000 UTC m=+216.993620842"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.551591 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.552727 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-qr9hb" podStartSLOduration=153.552697864 podStartE2EDuration="2m33.552697864s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.550806322 +0000 UTC m=+217.013426891" watchObservedRunningTime="2026-02-27 06:13:58.552697864 +0000 UTC m=+217.015318433"
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.554916 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.054898465 +0000 UTC m=+217.517519034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.595970 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" podStartSLOduration=153.595937973 podStartE2EDuration="2m33.595937973s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.588897417 +0000 UTC m=+217.051517986" watchObservedRunningTime="2026-02-27 06:13:58.595937973 +0000 UTC m=+217.058558542"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.604128 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54254: no serving certificate available for the kubelet"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.631669 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-glpkv" podStartSLOduration=153.631653082 podStartE2EDuration="2m33.631653082s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.63083848 +0000 UTC m=+217.093459059" watchObservedRunningTime="2026-02-27 06:13:58.631653082 +0000 UTC m=+217.094273651"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.653837 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.654011 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.153984891 +0000 UTC m=+217.616605460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.654221 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.664549 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.164532763 +0000 UTC m=+217.627153332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.696706 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-28snx" podStartSLOduration=153.696680434 podStartE2EDuration="2m33.696680434s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:58.674090908 +0000 UTC m=+217.136711477" watchObservedRunningTime="2026-02-27 06:13:58.696680434 +0000 UTC m=+217.159301003"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.706233 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54262: no serving certificate available for the kubelet"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.757962 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.758389 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.258368903 +0000 UTC m=+217.720989472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.835647 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54266: no serving certificate available for the kubelet"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.861624 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.862144 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.362128038 +0000 UTC m=+217.824748607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.929017 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54282: no serving certificate available for the kubelet"
Feb 27 06:13:58 crc kubenswrapper[4725]: I0227 06:13:58.963777 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:58 crc kubenswrapper[4725]: E0227 06:13:58.964732 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.46470964 +0000 UTC m=+217.927330209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.066786 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.074324 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54294: no serving certificate available for the kubelet"
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.074806 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.57478297 +0000 UTC m=+218.037403539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.168053 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.168448 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.668432695 +0000 UTC m=+218.131053264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: W0227 06:13:59.265155 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-96004613afb7b73cd8d808bc57886f437697bb0757d5eccde844a64199d3bc59 WatchSource:0}: Error finding container 96004613afb7b73cd8d808bc57886f437697bb0757d5eccde844a64199d3bc59: Status 404 returned error can't find the container with id 96004613afb7b73cd8d808bc57886f437697bb0757d5eccde844a64199d3bc59
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.269609 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.269979 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.769962738 +0000 UTC m=+218.232583307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.307696 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54310: no serving certificate available for the kubelet"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.371078 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.371359 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.871344397 +0000 UTC m=+218.333964966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.381476 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 06:13:59 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]process-running ok
Feb 27 06:13:59 crc kubenswrapper[4725]: healthz check failed
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.381538 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.472522 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.472813 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:13:59.972801598 +0000 UTC m=+218.435422167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.525077 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-27d6j"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.525434 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-27d6j"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.561348 4725 patch_prober.go:28] interesting pod/apiserver-76f77b778f-27d6j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]log ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]etcd ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/max-in-flight-filter ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 27 06:13:59 crc kubenswrapper[4725]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/openshift.io-startinformers ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 27 06:13:59 crc kubenswrapper[4725]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 27 06:13:59 crc kubenswrapper[4725]: livez check failed
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.561407 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" podUID="711e69a7-689f-47d6-840e-90ca3779ce5a" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.567827 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54322: no serving certificate available for the kubelet"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.573543 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ftrcl" event={"ID":"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d","Type":"ContainerStarted","Data":"5fff9207e46bba7ec1c9af7dee863fe2bda295a63604d9c99b54d4c31e7bc7bb"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.580240 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.580579 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.080560214 +0000 UTC m=+218.543180783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.584486 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x796f" event={"ID":"2aec175b-6e2b-4eac-a94f-771881386ffc","Type":"ContainerStarted","Data":"65a27e2bae79a41ab03240b9cf9aa33ceb6bd0fa7ca181c2473ddc4a9a63098e"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.646878 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" event={"ID":"c62ca9d3-ccb1-486e-9371-9f2b71893ec7","Type":"ContainerStarted","Data":"2559f8695657e641d458c000b72de04ee978d7a35bd9f04f7da09dfde5cf4b68"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.681579 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.688912 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.188895295 +0000 UTC m=+218.651515864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.711208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" event={"ID":"8aa612f1-e5b4-4f52-93d9-c52cedd7740e","Type":"ContainerStarted","Data":"4e22921c07df4a80d84899cc824e7f4360cb77b147210400197df26e68618a86"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.711784 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.723557 4725 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7tszh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body=
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.723630 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" podUID="8aa612f1-e5b4-4f52-93d9-c52cedd7740e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.759695 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" podStartSLOduration=154.759681467 podStartE2EDuration="2m34.759681467s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:59.757694862 +0000 UTC m=+218.220315431" watchObservedRunningTime="2026-02-27 06:13:59.759681467 +0000 UTC m=+218.222302036"
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.778370 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"490f9495b394bbb246cc041107b167e85194d8f039262ee288ac44360108f5a9"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.787926 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.788266 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.288252938 +0000 UTC m=+218.750873507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.844090 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" event={"ID":"affbff56-7e60-4b68-9efa-eb610da84f54","Type":"ContainerStarted","Data":"e7c5da72b03b2c500bc9fb454f4ee8308bd25aeeed6d1b943d54fd8a9c47482f"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.880901 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" event={"ID":"6b2da58a-3e24-4e72-a25d-eeee730910cd","Type":"ContainerStarted","Data":"d52afecf7df04841133babf90ab4d4798d3e09681c6f2f8a09036b464c298397"}
Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.889639 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.891185 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.39116947 +0000 UTC m=+218.853790039 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.897966 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4cxd" podStartSLOduration=154.897946858 podStartE2EDuration="2m34.897946858s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:13:59.88683478 +0000 UTC m=+218.349455349" watchObservedRunningTime="2026-02-27 06:13:59.897946858 +0000 UTC m=+218.360567427" Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.899162 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn747"] Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.922646 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" event={"ID":"49bf7d38-dc2d-4f9c-9050-202bb1e40747","Type":"ContainerStarted","Data":"ee83903208ad1535072da44f16e4e1d0bc8850f1ac993b0309fad7dba3e8816e"} Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.960644 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" event={"ID":"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780","Type":"ContainerStarted","Data":"55d426323bcb12403af558d1e98ed36a16d13e4d8a980e34f3bcfb235fe05c15"} Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.990904 
4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.992112 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"] Feb 27 06:13:59 crc kubenswrapper[4725]: E0227 06:13:59.992184 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.492167528 +0000 UTC m=+218.954788097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:13:59 crc kubenswrapper[4725]: I0227 06:13:59.992541 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" event={"ID":"1aadc96d-d9dd-4848-b4d1-6dc7579002f6","Type":"ContainerStarted","Data":"b4ada0fb7ebb8a45798d7d780d94705bbd44af82b9fa349ce5794b0e3a568270"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.013878 4725 generic.go:334] "Generic (PLEG): container finished" podID="cfc908de-22b4-4ca7-a60e-1ed8592f563e" containerID="3ec345f3ea6979c6b673c18b06c4a21dda9a0730d62fc879a8092433398fdcc4" exitCode=0 Feb 27 06:14:00 crc kubenswrapper[4725]: 
I0227 06:14:00.013946 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" event={"ID":"cfc908de-22b4-4ca7-a60e-1ed8592f563e","Type":"ContainerDied","Data":"3ec345f3ea6979c6b673c18b06c4a21dda9a0730d62fc879a8092433398fdcc4"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.022011 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54330: no serving certificate available for the kubelet" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.029713 4725 generic.go:334] "Generic (PLEG): container finished" podID="23b57c5c-dc24-40cd-8563-9e43ddc844dc" containerID="d07d3bf38da6ff1c9d130a1dbc7cc17929c54c0030e1e423214f857276561d67" exitCode=0 Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.029849 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" event={"ID":"23b57c5c-dc24-40cd-8563-9e43ddc844dc","Type":"ContainerDied","Data":"d07d3bf38da6ff1c9d130a1dbc7cc17929c54c0030e1e423214f857276561d67"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.074747 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"07b17eae706984c4cdbb1ee35cd15227e7d3ede82f52c74b7372ca2240ae2844"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.092921 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.093448 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.593430554 +0000 UTC m=+219.056051123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.098993 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" event={"ID":"2bc010a9-3840-4eaf-922c-9f67e82cb6bf","Type":"ContainerStarted","Data":"11c283722a263482176f1c48eaacb19b18a3540b0685dd0b148fbac87bb9e300"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.116399 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" event={"ID":"682c856f-0661-4039-b071-e5c75267f3f1","Type":"ContainerStarted","Data":"c7ca267c78f6a6a45ceffd9fbcad7712108ba4d0c97c14672d5dc6f2323d6fd8"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.148768 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" event={"ID":"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b","Type":"ContainerStarted","Data":"c92209934ea2c3b5406405ae0a91c5e3433f10c8c82f65b839cb82b95159f5c7"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.158468 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" 
event={"ID":"cdfdd960-ccbf-4554-b737-aa1c1f1e6572","Type":"ContainerStarted","Data":"1f5d92e9b0277867ae34183c7de3521ce3294e87b271aa61fe81ad633be0ca61"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.158549 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.192174 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-89pl9" podStartSLOduration=155.192151159 podStartE2EDuration="2m35.192151159s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.181137864 +0000 UTC m=+218.643758433" watchObservedRunningTime="2026-02-27 06:14:00.192151159 +0000 UTC m=+218.654771728" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.206612 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.208084 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.70806038 +0000 UTC m=+219.170680949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.217261 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536214-p7k5h"] Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.227646 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536214-p7k5h"] Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.227771 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.229005 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" event={"ID":"04f492bb-9555-42d8-9e3a-1e419c3d7607","Type":"ContainerStarted","Data":"f0793e4f2eb02cf6c00e5710f80d2dd4e10eb80aeef53b7cff53afed1c937854"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.240709 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.274529 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" event={"ID":"5a24054f-e850-4d6b-b15e-e37115316230","Type":"ContainerStarted","Data":"c5cb78b42a4277e7a026b08962b455504255b87d482c988072fc318193174e70"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.291807 4725 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" podStartSLOduration=155.29178799 podStartE2EDuration="2m35.29178799s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.240362315 +0000 UTC m=+218.702982894" watchObservedRunningTime="2026-02-27 06:14:00.29178799 +0000 UTC m=+218.754408559" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.296702 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"96004613afb7b73cd8d808bc57886f437697bb0757d5eccde844a64199d3bc59"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.308960 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" event={"ID":"cd133d9f-4963-42d4-bd73-f09c510629cc","Type":"ContainerStarted","Data":"67c862c1cf79480f5e58a88fbb210c901792fdbe66ebb0a8870263d923252d56"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.309139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.309186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnb4\" (UniqueName: \"kubernetes.io/projected/06c0abcb-fb62-4f62-b73e-a27620de9add-kube-api-access-8nnb4\") pod \"auto-csr-approver-29536214-p7k5h\" (UID: 
\"06c0abcb-fb62-4f62-b73e-a27620de9add\") " pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.309924 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.309933 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.809913012 +0000 UTC m=+219.272533581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.352529 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.362207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" event={"ID":"a7826377-b3f7-47a0-8967-04fa1243de5f","Type":"ContainerStarted","Data":"f0ae7f5cf7b2f0b263061b5369d6baf24d18d8af254f7ecb40074bc168411022"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.377421 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:00 
crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 27 06:14:00 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:00 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.377486 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.386596 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.406153 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536212-627nh" event={"ID":"234512e0-3471-4bd8-b783-6df7b63f2cfe","Type":"ContainerStarted","Data":"c968e56d6f45f123d0e6f236b3cbbc2384fe6d4f94c88aed5a46791a9d4bfcc8"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.418014 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.418518 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnb4\" (UniqueName: \"kubernetes.io/projected/06c0abcb-fb62-4f62-b73e-a27620de9add-kube-api-access-8nnb4\") pod \"auto-csr-approver-29536214-p7k5h\" (UID: \"06c0abcb-fb62-4f62-b73e-a27620de9add\") " pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.418913 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:00.918898512 +0000 UTC m=+219.381519081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.438522 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" podStartSLOduration=155.438504675 podStartE2EDuration="2m35.438504675s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.344216443 +0000 UTC m=+218.806837042" watchObservedRunningTime="2026-02-27 06:14:00.438504675 +0000 UTC m=+218.901125244" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.470698 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" event={"ID":"05387b29-effd-4fb4-9bd4-21acd9989f76","Type":"ContainerStarted","Data":"602a49ec0495b3ce10b396db3b3df65e5309d45a25aca1aa05fa51bddee19677"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.495856 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnb4\" (UniqueName: \"kubernetes.io/projected/06c0abcb-fb62-4f62-b73e-a27620de9add-kube-api-access-8nnb4\") pod \"auto-csr-approver-29536214-p7k5h\" 
(UID: \"06c0abcb-fb62-4f62-b73e-a27620de9add\") " pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.505255 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" event={"ID":"109c9025-3aff-4dc5-a351-c6686c336671","Type":"ContainerStarted","Data":"9d250edb0dee41b461118f3222b451bc8b1bb33097e42664d2dfde710e772671"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.505938 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rlqln" podStartSLOduration=155.505911653 podStartE2EDuration="2m35.505911653s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.439980366 +0000 UTC m=+218.902600935" watchObservedRunningTime="2026-02-27 06:14:00.505911653 +0000 UTC m=+218.968532222" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.526571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.526861 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.026849363 +0000 UTC m=+219.489469922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.531339 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" event={"ID":"ea0d4003-2819-4304-ad4f-0815dd53db79","Type":"ContainerStarted","Data":"9b3ecc236e8d5ff5e820ee01f58e852fae733d5d80383edf986df02c47576155"} Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.595262 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.602844 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.632055 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.633298 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 06:14:01.133266312 +0000 UTC m=+219.595886881 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.707354 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-brxkd" podStartSLOduration=155.707334864 podStartE2EDuration="2m35.707334864s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.631989626 +0000 UTC m=+219.094610195" watchObservedRunningTime="2026-02-27 06:14:00.707334864 +0000 UTC m=+219.169955433" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.738735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.750946 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 06:14:01.250924792 +0000 UTC m=+219.713545361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.786030 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sj48h" podStartSLOduration=155.786012104 podStartE2EDuration="2m35.786012104s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.785662214 +0000 UTC m=+219.248282773" watchObservedRunningTime="2026-02-27 06:14:00.786012104 +0000 UTC m=+219.248632673" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.786920 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z2pbk" podStartSLOduration=155.786914579 podStartE2EDuration="2m35.786914579s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.709241637 +0000 UTC m=+219.171862226" watchObservedRunningTime="2026-02-27 06:14:00.786914579 +0000 UTC m=+219.249535148" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.839821 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.840628 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.340612267 +0000 UTC m=+219.803232836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.883664 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54342: no serving certificate available for the kubelet" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.898711 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" podStartSLOduration=156.898693646 podStartE2EDuration="2m36.898693646s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:00.890086467 +0000 UTC m=+219.352707046" watchObservedRunningTime="2026-02-27 06:14:00.898693646 +0000 UTC m=+219.361314215" Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.945232 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:00 crc kubenswrapper[4725]: E0227 06:14:00.945547 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.445535874 +0000 UTC m=+219.908156443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:00 crc kubenswrapper[4725]: I0227 06:14:00.952463 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r5pzq" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.052765 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.053167 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.553149336 +0000 UTC m=+220.015769895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.156312 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.157040 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.657028334 +0000 UTC m=+220.119648893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.262131 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.263161 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.763137404 +0000 UTC m=+220.225757973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.366663 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.367165 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.867148496 +0000 UTC m=+220.329769065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.371443 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:01 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 27 06:14:01 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:01 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.371521 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.441944 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536214-p7k5h"] Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.468875 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.469333 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:01.969312477 +0000 UTC m=+220.431933036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.575229 4725 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zbhxj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.575570 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.576495 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.576743 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.076732893 +0000 UTC m=+220.539353462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.601498 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0268ad135f86e356c5f2104b630a4c767f2190ad4a5fe00a9c04f2d7747e54a0"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.621255 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" event={"ID":"04f492bb-9555-42d8-9e3a-1e419c3d7607","Type":"ContainerStarted","Data":"01c93d851a3d35cf5a628f7e0a2843a6b570e7eace914c101be4a96e9fff98a9"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.621313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" event={"ID":"04f492bb-9555-42d8-9e3a-1e419c3d7607","Type":"ContainerStarted","Data":"ec162ec6700e0ea5d50773f27c4609e0f52066a1dc9fcc3a4a2bc5087d11bcf2"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.621580 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.632427 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" event={"ID":"49bf7d38-dc2d-4f9c-9050-202bb1e40747","Type":"ContainerStarted","Data":"f17606d2eebb00875ef4885edc126c97b78003f16ef1f15bc8fa6238a1047c06"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.632471 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" event={"ID":"49bf7d38-dc2d-4f9c-9050-202bb1e40747","Type":"ContainerStarted","Data":"97de93480e1fa53d9f2012237b3eb46e373e077ed06ceee40a6ec848a3e85cd6"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.646630 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" event={"ID":"6931a0dd-a354-441f-ae2c-3d6c3e59777e","Type":"ContainerStarted","Data":"56d5abbd493b2972fa8d47c10aec926ec133401199d399811930d0197ef30a08"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.650120 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" event={"ID":"c62ca9d3-ccb1-486e-9371-9f2b71893ec7","Type":"ContainerStarted","Data":"d9e82456708553ad66a8d3f2eac724499f858a8d60534227fc733107080f9a14"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.651979 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" event={"ID":"8aa612f1-e5b4-4f52-93d9-c52cedd7740e","Type":"ContainerStarted","Data":"cf5c36e37d032ea8b4c59c302c02d0dc54ad267899cd65cbb3c27585f911517a"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.679281 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ftrcl" event={"ID":"7b4a7f10-c643-4e7c-b4c9-76bef06ce76d","Type":"ContainerStarted","Data":"e0ec788a236bc9c2f5d82cf9160a6993eb9da7c06fe46f2c7fa5f368be0dae3d"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.679425 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7tszh" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.679790 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.680810 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.180782706 +0000 UTC m=+220.643403275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.696223 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" podStartSLOduration=156.696207043 podStartE2EDuration="2m36.696207043s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:01.666463569 +0000 UTC m=+220.129084138" watchObservedRunningTime="2026-02-27 06:14:01.696207043 +0000 UTC m=+220.158827612" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.697960 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vg2qp" podStartSLOduration=156.697952682 podStartE2EDuration="2m36.697952682s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:01.695672288 +0000 UTC m=+220.158292857" watchObservedRunningTime="2026-02-27 06:14:01.697952682 +0000 UTC m=+220.160573251" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.698282 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fa677fbebf74f6a9b379fb596ef05618a7cb6d1c679b10d4ba0d2a547f7ee2c0"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.698359 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.723719 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pmdnc" event={"ID":"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07","Type":"ContainerStarted","Data":"70cfc0ee3c0b96a89b95218866ca58c76f4de4406f93d2ca1cdc2355a314a300"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.723811 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pmdnc" event={"ID":"cc7b9cb5-ba37-4857-a03f-c63a32ad6c07","Type":"ContainerStarted","Data":"f48d4b5ffb417ea1214ce77b8c1564b02a12ac674e66da6ce29cf1e4161c2ab3"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.723907 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pmdnc" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.744973 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jmnj2" podStartSLOduration=156.744946964 podStartE2EDuration="2m36.744946964s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:01.741515959 +0000 UTC m=+220.204136528" watchObservedRunningTime="2026-02-27 06:14:01.744946964 +0000 UTC m=+220.207567533" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.755544 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" 
event={"ID":"2bc010a9-3840-4eaf-922c-9f67e82cb6bf","Type":"ContainerStarted","Data":"129f24eab555dd78736c85e68127c7175e8c1d93e334d76f2fe38fe64c053c52"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.756679 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.760421 4725 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-n8wkn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.760468 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" podUID="2bc010a9-3840-4eaf-922c-9f67e82cb6bf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.766238 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" event={"ID":"06c0abcb-fb62-4f62-b73e-a27620de9add","Type":"ContainerStarted","Data":"754f60393f81a0610c0029d04eb82deb299079ff5f6102ec2e926fb2204830d7"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.778389 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mc28q" podStartSLOduration=156.77837017 podStartE2EDuration="2m36.77837017s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:01.773433193 +0000 UTC m=+220.236053772" watchObservedRunningTime="2026-02-27 06:14:01.77837017 +0000 
UTC m=+220.240990739" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.781402 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.782816 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.282793432 +0000 UTC m=+220.745414001 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.783410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6lpfn" event={"ID":"5a24054f-e850-4d6b-b15e-e37115316230","Type":"ContainerStarted","Data":"294de298110e202b6e879a2f3e032a37d8e79526f9918645eb3a865d49a52207"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.804016 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" event={"ID":"6b2da58a-3e24-4e72-a25d-eeee730910cd","Type":"ContainerStarted","Data":"886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec"} Feb 27 06:14:01 crc 
kubenswrapper[4725]: I0227 06:14:01.804872 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.805827 4725 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zmwz5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.805870 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.815517 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" event={"ID":"1aadc96d-d9dd-4848-b4d1-6dc7579002f6","Type":"ContainerStarted","Data":"321d71da61b452130f1ed4d240072c75891d27dd0ce2ddceca2d3da18f51f75f"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.848058 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" event={"ID":"1002e916-f907-4ae4-bc0f-3a08f4b70f5f","Type":"ContainerStarted","Data":"0f9896f0cf66f53aed837d610b38e26b13b8798704ced965c3781fc31426c160"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.882465 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:01 
crc kubenswrapper[4725]: E0227 06:14:01.883876 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.383850782 +0000 UTC m=+220.846471351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.900732 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pmdnc" podStartSLOduration=8.90071214 podStartE2EDuration="8.90071214s" podCreationTimestamp="2026-02-27 06:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:01.873590408 +0000 UTC m=+220.336210977" watchObservedRunningTime="2026-02-27 06:14:01.90071214 +0000 UTC m=+220.363332709" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.903110 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" event={"ID":"cfc908de-22b4-4ca7-a60e-1ed8592f563e","Type":"ContainerStarted","Data":"f6690517af0c269426e81f94be63671fcbf7b6a0e6cb01e97ce031f3ac538d75"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.908272 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" 
event={"ID":"ec8abb87-07ac-4227-a787-6db444002c0c","Type":"ContainerStarted","Data":"6be5eb062f1ed509cf8e7a71e52c45c73ca9f6efb1abc61be04c887129d8ae59"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.910928 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"35220c12c2662c2739a25e39c8fd60c4ec436c51e342f0a4f68ae1bf9f38aa67"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.931646 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" event={"ID":"a7826377-b3f7-47a0-8967-04fa1243de5f","Type":"ContainerStarted","Data":"e86c164a9b6d45eeb0a12425f734be7d6ee9bc2c50173c98c52135e49c55a396"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.931699 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" event={"ID":"a7826377-b3f7-47a0-8967-04fa1243de5f","Type":"ContainerStarted","Data":"3773e6c1bb886bf2bb0fed4555d7a1e2f7b0790f72be780a09d838f10ff2722a"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.939157 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" event={"ID":"4ac52b3f-d9d6-499e-b0e7-d2fb07de2780","Type":"ContainerStarted","Data":"67309e322b348ea225daeaace56a1eddc4adeaa5c8d4ba7c11fd336a450e5478"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.941364 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" event={"ID":"5d1908a5-f826-4eae-a6fa-c899dda28b57","Type":"ContainerStarted","Data":"6f3f0781b6836ad165a8d34c5aacfb80cf7fb492aa0d8b82cf018e02b33cbde1"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.963713 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-ftrcl" podStartSLOduration=8.963691354 podStartE2EDuration="8.963691354s" podCreationTimestamp="2026-02-27 06:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:01.962330067 +0000 UTC m=+220.424950646" watchObservedRunningTime="2026-02-27 06:14:01.963691354 +0000 UTC m=+220.426311923" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.972833 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" event={"ID":"23b57c5c-dc24-40cd-8563-9e43ddc844dc","Type":"ContainerStarted","Data":"19bfb16e6b23c79f1bc868e871f0098eeccd6f68297ce039836d1a49435f75ee"} Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.973619 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:14:01 crc kubenswrapper[4725]: I0227 06:14:01.985472 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:01 crc kubenswrapper[4725]: E0227 06:14:01.989276 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.489254882 +0000 UTC m=+220.951875451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.006521 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-p5m2x" event={"ID":"5e72c156-e23c-414e-a88d-5cef5965f47e","Type":"ContainerStarted","Data":"b9f83b930804001fdbc426648125f1aee719301d0004a29a19bd634c86a6dc68"} Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.036780 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" event={"ID":"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b","Type":"ContainerStarted","Data":"46303f5efb305c6f027214ea28be51e4a8da5c2f03b125dedfdd0a818fa9a781"} Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.036852 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" event={"ID":"cdf64ed0-6837-4ef7-b2c5-2e5880e1385b","Type":"ContainerStarted","Data":"ab1f48fc568864efd38e99957ba0a354f3bd651fd01ec51339cb02f542a00197"} Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.038279 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" podUID="cdfdd960-ccbf-4554-b737-aa1c1f1e6572" containerName="route-controller-manager" containerID="cri-o://1f5d92e9b0277867ae34183c7de3521ce3294e87b271aa61fe81ad633be0ca61" gracePeriod=30 Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.039342 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" podUID="eda518bc-5412-413a-8958-ea97c24a9795" containerName="controller-manager" containerID="cri-o://f55ec42995860ce348d166b7e2a2e0abcb0039daeaa8b22c469cbe5b04c58e80" gracePeriod=30 Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.058739 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.089760 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.092265 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.592236345 +0000 UTC m=+221.054856914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.099422 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn" podStartSLOduration=157.099390213 podStartE2EDuration="2m37.099390213s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.079634066 +0000 UTC m=+220.542254645" watchObservedRunningTime="2026-02-27 06:14:02.099390213 +0000 UTC m=+220.562010782" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.143703 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-p5m2x" podStartSLOduration=9.14365411 podStartE2EDuration="9.14365411s" podCreationTimestamp="2026-02-27 06:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.134296351 +0000 UTC m=+220.596916920" watchObservedRunningTime="2026-02-27 06:14:02.14365411 +0000 UTC m=+220.606274679" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.192998 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: 
\"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.193580 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.693566843 +0000 UTC m=+221.156187412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.237752 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54350: no serving certificate available for the kubelet" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.251305 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ndprs" podStartSLOduration=157.251264622 podStartE2EDuration="2m37.251264622s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.245907583 +0000 UTC m=+220.708528162" watchObservedRunningTime="2026-02-27 06:14:02.251264622 +0000 UTC m=+220.713885191" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.275637 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-g7kxx" podStartSLOduration=157.275614306 podStartE2EDuration="2m37.275614306s" podCreationTimestamp="2026-02-27 06:11:25 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.27393824 +0000 UTC m=+220.736558819" watchObservedRunningTime="2026-02-27 06:14:02.275614306 +0000 UTC m=+220.738234875" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.295383 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.295683 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.795662152 +0000 UTC m=+221.258282721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.303551 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8mzb" podStartSLOduration=157.3035339 podStartE2EDuration="2m37.3035339s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.301870124 +0000 UTC m=+220.764490693" watchObservedRunningTime="2026-02-27 06:14:02.3035339 +0000 UTC m=+220.766154469" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.370034 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:02 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 27 06:14:02 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:02 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.370141 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.386270 4725 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" podStartSLOduration=158.386248302 podStartE2EDuration="2m38.386248302s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.346459219 +0000 UTC m=+220.809079798" watchObservedRunningTime="2026-02-27 06:14:02.386248302 +0000 UTC m=+220.848868871" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.398423 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.399013 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:02.898994985 +0000 UTC m=+221.361615554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.432054 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9v749" podStartSLOduration=157.43203124 podStartE2EDuration="2m37.43203124s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.427944757 +0000 UTC m=+220.890565326" watchObservedRunningTime="2026-02-27 06:14:02.43203124 +0000 UTC m=+220.894651809" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.461923 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fgnvc" podStartSLOduration=157.461901428 podStartE2EDuration="2m37.461901428s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.457768133 +0000 UTC m=+220.920388702" watchObservedRunningTime="2026-02-27 06:14:02.461901428 +0000 UTC m=+220.924521997" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.486265 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" podStartSLOduration=157.486238032 podStartE2EDuration="2m37.486238032s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.485995405 +0000 UTC m=+220.948615984" watchObservedRunningTime="2026-02-27 06:14:02.486238032 +0000 UTC m=+220.948858601" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.500001 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.500330 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.000312822 +0000 UTC m=+221.462933381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.514788 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" podStartSLOduration=157.514770223 podStartE2EDuration="2m37.514770223s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:02.513407305 +0000 UTC m=+220.976027884" watchObservedRunningTime="2026-02-27 06:14:02.514770223 +0000 UTC m=+220.977390792" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.554641 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.554719 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.601603 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.602017 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.10200093 +0000 UTC m=+221.564621499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.702603 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.702969 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.202942577 +0000 UTC m=+221.665563146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.703177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.703664 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.203558414 +0000 UTC m=+221.666178983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.803855 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.804046 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.304018297 +0000 UTC m=+221.766638866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.804478 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.804758 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.304745097 +0000 UTC m=+221.767365666 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:02 crc kubenswrapper[4725]: I0227 06:14:02.907755 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:02 crc kubenswrapper[4725]: E0227 06:14:02.908095 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.40807874 +0000 UTC m=+221.870699309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.008864 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.009157 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.509144711 +0000 UTC m=+221.971765280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.014914 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.043432 4725 generic.go:334] "Generic (PLEG): container finished" podID="eda518bc-5412-413a-8958-ea97c24a9795" containerID="f55ec42995860ce348d166b7e2a2e0abcb0039daeaa8b22c469cbe5b04c58e80" exitCode=0 Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.043496 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" event={"ID":"eda518bc-5412-413a-8958-ea97c24a9795","Type":"ContainerDied","Data":"f55ec42995860ce348d166b7e2a2e0abcb0039daeaa8b22c469cbe5b04c58e80"} Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.056387 4725 generic.go:334] "Generic (PLEG): container finished" podID="cdfdd960-ccbf-4554-b737-aa1c1f1e6572" containerID="1f5d92e9b0277867ae34183c7de3521ce3294e87b271aa61fe81ad633be0ca61" exitCode=0 Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.057632 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" event={"ID":"cdfdd960-ccbf-4554-b737-aa1c1f1e6572","Type":"ContainerDied","Data":"1f5d92e9b0277867ae34183c7de3521ce3294e87b271aa61fe81ad633be0ca61"} Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.060387 4725 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zmwz5 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.060740 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.067786 4725 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-9zfdj container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.067910 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" podUID="23b57c5c-dc24-40cd-8563-9e43ddc844dc" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.091191 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.115880 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.116964 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.616726752 +0000 UTC m=+222.079347321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.149822 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"] Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.166969 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfdd960-ccbf-4554-b737-aa1c1f1e6572" containerName="route-controller-manager" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.166997 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfdd960-ccbf-4554-b737-aa1c1f1e6572" containerName="route-controller-manager" Feb 27 06:14:03 crc 
kubenswrapper[4725]: I0227 06:14:03.167176 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfdd960-ccbf-4554-b737-aa1c1f1e6572" containerName="route-controller-manager"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.167721 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.190330 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.190357 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.216902 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnnwp\" (UniqueName: \"kubernetes.io/projected/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-kube-api-access-bnnwp\") pod \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.217167 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-config\") pod \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.217379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-serving-cert\") pod \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.217457 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-client-ca\") pod \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\" (UID: \"cdfdd960-ccbf-4554-b737-aa1c1f1e6572\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.218498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.218563 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-config" (OuterVolumeSpecName: "config") pod "cdfdd960-ccbf-4554-b737-aa1c1f1e6572" (UID: "cdfdd960-ccbf-4554-b737-aa1c1f1e6572"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.218797 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.222695 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-client-ca" (OuterVolumeSpecName: "client-ca") pod "cdfdd960-ccbf-4554-b737-aa1c1f1e6572" (UID: "cdfdd960-ccbf-4554-b737-aa1c1f1e6572"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.228908 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.728895059 +0000 UTC m=+222.191515628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.258379 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cdfdd960-ccbf-4554-b737-aa1c1f1e6572" (UID: "cdfdd960-ccbf-4554-b737-aa1c1f1e6572"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.285740 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-kube-api-access-bnnwp" (OuterVolumeSpecName: "kube-api-access-bnnwp") pod "cdfdd960-ccbf-4554-b737-aa1c1f1e6572" (UID: "cdfdd960-ccbf-4554-b737-aa1c1f1e6572"). InnerVolumeSpecName "kube-api-access-bnnwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.327549 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b6cpp"]
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.327733 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda518bc-5412-413a-8958-ea97c24a9795" containerName="controller-manager"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.327744 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda518bc-5412-413a-8958-ea97c24a9795" containerName="controller-manager"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.327833 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda518bc-5412-413a-8958-ea97c24a9795" containerName="controller-manager"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.328562 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.329677 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-config\") pod \"eda518bc-5412-413a-8958-ea97c24a9795\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.329828 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.329865 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eda518bc-5412-413a-8958-ea97c24a9795-serving-cert\") pod \"eda518bc-5412-413a-8958-ea97c24a9795\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330019 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-client-ca\") pod \"eda518bc-5412-413a-8958-ea97c24a9795\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330055 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbh79\" (UniqueName: \"kubernetes.io/projected/eda518bc-5412-413a-8958-ea97c24a9795-kube-api-access-sbh79\") pod \"eda518bc-5412-413a-8958-ea97c24a9795\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330082 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-proxy-ca-bundles\") pod \"eda518bc-5412-413a-8958-ea97c24a9795\" (UID: \"eda518bc-5412-413a-8958-ea97c24a9795\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330282 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf2fd61-154b-4d80-b6c5-add580271096-serving-cert\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330352 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-client-ca\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330378 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js26w\" (UniqueName: \"kubernetes.io/projected/7cf2fd61-154b-4d80-b6c5-add580271096-kube-api-access-js26w\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330408 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-config\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330502 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnnwp\" (UniqueName: \"kubernetes.io/projected/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-kube-api-access-bnnwp\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330518 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.330529 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfdd960-ccbf-4554-b737-aa1c1f1e6572-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.331415 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-config" (OuterVolumeSpecName: "config") pod "eda518bc-5412-413a-8958-ea97c24a9795" (UID: "eda518bc-5412-413a-8958-ea97c24a9795"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.331502 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.831483042 +0000 UTC m=+222.294103611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.332267 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-client-ca" (OuterVolumeSpecName: "client-ca") pod "eda518bc-5412-413a-8958-ea97c24a9795" (UID: "eda518bc-5412-413a-8958-ea97c24a9795"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.332392 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eda518bc-5412-413a-8958-ea97c24a9795" (UID: "eda518bc-5412-413a-8958-ea97c24a9795"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.344708 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.344903 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda518bc-5412-413a-8958-ea97c24a9795-kube-api-access-sbh79" (OuterVolumeSpecName: "kube-api-access-sbh79") pod "eda518bc-5412-413a-8958-ea97c24a9795" (UID: "eda518bc-5412-413a-8958-ea97c24a9795"). InnerVolumeSpecName "kube-api-access-sbh79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.349596 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6cpp"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.379948 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 06:14:03 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld
Feb 27 06:14:03 crc kubenswrapper[4725]: [+]process-running ok
Feb 27 06:14:03 crc kubenswrapper[4725]: healthz check failed
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.380008 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.380068 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda518bc-5412-413a-8958-ea97c24a9795-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eda518bc-5412-413a-8958-ea97c24a9795" (UID: "eda518bc-5412-413a-8958-ea97c24a9795"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.431877 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js26w\" (UniqueName: \"kubernetes.io/projected/7cf2fd61-154b-4d80-b6c5-add580271096-kube-api-access-js26w\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.431940 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-config\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.431976 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz9d2\" (UniqueName: \"kubernetes.io/projected/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-kube-api-access-kz9d2\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.431997 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432035 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-utilities\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432055 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-catalog-content\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432074 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf2fd61-154b-4d80-b6c5-add580271096-serving-cert\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432098 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-client-ca\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432129 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432139 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbh79\" (UniqueName: \"kubernetes.io/projected/eda518bc-5412-413a-8958-ea97c24a9795-kube-api-access-sbh79\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432148 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432156 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda518bc-5412-413a-8958-ea97c24a9795-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432163 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eda518bc-5412-413a-8958-ea97c24a9795-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.432929 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-client-ca\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.433917 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-config\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.434154 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:03.934142656 +0000 UTC m=+222.396763225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.439523 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf2fd61-154b-4d80-b6c5-add580271096-serving-cert\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.450113 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js26w\" (UniqueName: \"kubernetes.io/projected/7cf2fd61-154b-4d80-b6c5-add580271096-kube-api-access-js26w\") pod \"route-controller-manager-6985668574-8g7qr\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.511405 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-htnhk"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.512527 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.524105 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.531517 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-htnhk"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.533129 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.533358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-utilities\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.533389 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-catalog-content\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.533454 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz9d2\" (UniqueName: \"kubernetes.io/projected/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-kube-api-access-kz9d2\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.533784 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.033768077 +0000 UTC m=+222.496388646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.534142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-utilities\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.534363 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-catalog-content\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.548370 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.571064 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz9d2\" (UniqueName: \"kubernetes.io/projected/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-kube-api-access-kz9d2\") pod \"certified-operators-b6cpp\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.572810 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.573426 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.580339 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.580558 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.582554 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.641640 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-catalog-content\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.641699 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7j4\" (UniqueName: \"kubernetes.io/projected/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-kube-api-access-zv7j4\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.641741 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-utilities\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.641785 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.642079 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.142067288 +0000 UTC m=+222.604687847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.650925 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8wkn"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.700429 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6cpp"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.723052 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j6hj6"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.726571 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j6hj6"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.748588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.748834 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-utilities\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.748868 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c43677-5742-41b3-9f77-2c22d965a7e6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.748951 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.248915988 +0000 UTC m=+222.711536557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.749042 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c43677-5742-41b3-9f77-2c22d965a7e6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.749123 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.749187 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-catalog-content\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.749276 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7j4\" (UniqueName: \"kubernetes.io/projected/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-kube-api-access-zv7j4\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.749302 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-utilities\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.749607 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.249590407 +0000 UTC m=+222.712210976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.749753 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-catalog-content\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.751064 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j6hj6"]
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.786182 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7j4\" (UniqueName: \"kubernetes.io/projected/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-kube-api-access-zv7j4\") pod \"community-operators-htnhk\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850333 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htnhk"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850491 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850742 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-utilities\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c43677-5742-41b3-9f77-2c22d965a7e6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850813 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c43677-5742-41b3-9f77-2c22d965a7e6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850850 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2bg4\" (UniqueName: \"kubernetes.io/projected/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-kube-api-access-w2bg4\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6"
Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.850899 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-catalog-content\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6"
Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.850998 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.350980276 +0000 UTC m=+222.813600835 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.851046 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c43677-5742-41b3-9f77-2c22d965a7e6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.873225 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c43677-5742-41b3-9f77-2c22d965a7e6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.921893 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.923698 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8rj5q"] Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.924697 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.929981 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rj5q"] Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.952897 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-catalog-content\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.952947 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-utilities\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.953005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.953024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2bg4\" (UniqueName: \"kubernetes.io/projected/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-kube-api-access-w2bg4\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.953674 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-catalog-content\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.953879 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-utilities\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:03 crc kubenswrapper[4725]: E0227 06:14:03.954202 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.454185345 +0000 UTC m=+222.916805914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:03 crc kubenswrapper[4725]: I0227 06:14:03.993501 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2bg4\" (UniqueName: \"kubernetes.io/projected/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-kube-api-access-w2bg4\") pod \"certified-operators-j6hj6\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.003885 4725 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.054119 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.054335 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-catalog-content\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.054365 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-utilities\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.054413 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55ss\" (UniqueName: \"kubernetes.io/projected/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-kube-api-access-j55ss\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: E0227 06:14:04.054543 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.554527656 +0000 UTC m=+223.017148225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.067263 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.097824 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.127895 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" event={"ID":"1aadc96d-d9dd-4848-b4d1-6dc7579002f6","Type":"ContainerStarted","Data":"919f0aacc8e81a04dd45d6ff54512e06f789edd6b507ef473ee9ef02a15dd0c7"} Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.160873 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" event={"ID":"eda518bc-5412-413a-8958-ea97c24a9795","Type":"ContainerDied","Data":"4ba6fd90af4e33eb5bb36918010e39461cd9388d94a056b03542882de1cf251f"} Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.161184 4725 scope.go:117] "RemoveContainer" containerID="f55ec42995860ce348d166b7e2a2e0abcb0039daeaa8b22c469cbe5b04c58e80" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.161206 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn747" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.164713 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-catalog-content\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.164751 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-utilities\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.164794 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55ss\" (UniqueName: \"kubernetes.io/projected/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-kube-api-access-j55ss\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.164839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:04 crc kubenswrapper[4725]: E0227 06:14:04.165080 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 06:14:04.665068799 +0000 UTC m=+223.127689368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.165450 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-catalog-content\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.165649 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-utilities\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.180869 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6cpp"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.215253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55ss\" (UniqueName: \"kubernetes.io/projected/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-kube-api-access-j55ss\") pod \"community-operators-8rj5q\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.228553 4725 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.230111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn" event={"ID":"cdfdd960-ccbf-4554-b737-aa1c1f1e6572","Type":"ContainerDied","Data":"0603ee1a84d64793a72106d3cbc363cb44c27597d29e31e4363cf613045a534a"} Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.267554 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:04 crc kubenswrapper[4725]: E0227 06:14:04.267844 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.767829506 +0000 UTC m=+223.230450075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.268334 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.369537 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:04 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 27 06:14:04 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:04 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.369587 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.371113 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:04 crc kubenswrapper[4725]: E0227 06:14:04.372256 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:04.872243309 +0000 UTC m=+223.334863878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.392161 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn747"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.393599 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn747"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.436056 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.439368 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lv6kn"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.472665 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:04 crc kubenswrapper[4725]: E0227 06:14:04.473047 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 06:14:04.973030491 +0000 UTC m=+223.435651060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.496460 4725 scope.go:117] "RemoveContainer" containerID="1f5d92e9b0277867ae34183c7de3521ce3294e87b271aa61fe81ad633be0ca61" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.512202 4725 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-27T06:14:04.003918754Z","Handler":null,"Name":""} Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.541024 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.562527 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-27d6j" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.575106 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:04 crc kubenswrapper[4725]: E0227 06:14:04.575837 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 06:14:05.0758255 +0000 UTC m=+223.538446069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p26pd" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.578742 4725 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.578781 4725 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.648710 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.666654 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-htnhk"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.676502 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 
06:14:04.728836 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.763214 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rj5q"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.779545 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.788219 4725 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.788263 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.807565 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j6hj6"] Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.836895 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54358: no serving certificate available for the kubelet" Feb 27 06:14:04 crc kubenswrapper[4725]: W0227 06:14:04.850082 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfc8e5f_5a0f_4384_a2af_0817928d8ba5.slice/crio-84da428a1b6f81b9ee13ab3ebe621cd9816f9a2cd1daad7a65f0361268069b32 WatchSource:0}: Error finding container 84da428a1b6f81b9ee13ab3ebe621cd9816f9a2cd1daad7a65f0361268069b32: Status 404 returned error can't find the container with id 84da428a1b6f81b9ee13ab3ebe621cd9816f9a2cd1daad7a65f0361268069b32 Feb 27 06:14:04 crc kubenswrapper[4725]: I0227 06:14:04.852602 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p26pd\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:05 crc kubenswrapper[4725]: 
I0227 06:14:05.003316 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.217114 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c86f7b94d-mqkrx"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.218578 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.226027 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.226264 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.226429 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.226637 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.226756 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.228831 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.233055 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c86f7b94d-mqkrx"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.244138 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.249622 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6c43677-5742-41b3-9f77-2c22d965a7e6","Type":"ContainerStarted","Data":"a5d59bce631a170204a420cb24044172e349bc0af424191d4890a7b172096d04"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.277956 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" event={"ID":"7cf2fd61-154b-4d80-b6c5-add580271096","Type":"ContainerStarted","Data":"a48027c83e457a5aae688b12c8a12ecd80e64b547b5f3f9bc9bea855fc7e0118"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.277995 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" event={"ID":"7cf2fd61-154b-4d80-b6c5-add580271096","Type":"ContainerStarted","Data":"62fb44b7d515ef6918696e3acbb7772963a26ac39499e1409120d2996442928d"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.279313 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.281716 4725 generic.go:334] "Generic (PLEG): container finished" podID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerID="228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942" exitCode=0 Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.282048 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerDied","Data":"228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.282085 4725 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerStarted","Data":"ed4e30b6b4845151a4ac08e4f175981cc02d10a72d2e2342f38e4800eee84cdc"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.285054 4725 generic.go:334] "Generic (PLEG): container finished" podID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerID="2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce" exitCode=0 Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.286112 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerDied","Data":"2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.286156 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerStarted","Data":"09eb3998fde19b40b7e31b963286b117407ae1faadef4cb1156a08f353899e60"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.286950 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9zfdj" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.305410 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.311938 4725 generic.go:334] "Generic (PLEG): container finished" podID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerID="0768a8fa11129655d4b1d82f0bd5334f11ec5c75ca7feee88c082eaed04be0d3" exitCode=0 Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.312111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" 
event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerDied","Data":"0768a8fa11129655d4b1d82f0bd5334f11ec5c75ca7feee88c082eaed04be0d3"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.312151 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerStarted","Data":"6aa1725f90bf8dd86a7801592b7f59e01f359a9d614dbe2388373c587bba8d3a"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.324257 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2q5b8"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.326505 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" podStartSLOduration=4.326485629 podStartE2EDuration="4.326485629s" podCreationTimestamp="2026-02-27 06:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:05.318093216 +0000 UTC m=+223.780713795" watchObservedRunningTime="2026-02-27 06:14:05.326485629 +0000 UTC m=+223.789106198" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.327647 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.344765 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.350985 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q5b8"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.373373 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:05 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 27 06:14:05 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:05 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.373435 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.383771 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" event={"ID":"1aadc96d-d9dd-4848-b4d1-6dc7579002f6","Type":"ContainerStarted","Data":"e25b65060aa6f4ac1d4408e7b95d96a23f9291837ec633bca08383b21959163b"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.383819 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" event={"ID":"1aadc96d-d9dd-4848-b4d1-6dc7579002f6","Type":"ContainerStarted","Data":"578d4b9f472e324ec53e49f93a6616fb6000896b955bf44afc3b9640f241508c"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.405408 4725 generic.go:334] 
"Generic (PLEG): container finished" podID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerID="8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339" exitCode=0 Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.405598 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerDied","Data":"8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.405628 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerStarted","Data":"84da428a1b6f81b9ee13ab3ebe621cd9816f9a2cd1daad7a65f0361268069b32"} Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.432643 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-client-ca\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.432896 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-config\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.432932 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-proxy-ca-bundles\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: 
\"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.433000 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drmb\" (UniqueName: \"kubernetes.io/projected/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-kube-api-access-8drmb\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.433039 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-serving-cert\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.533990 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-64j8f" podStartSLOduration=12.533965618 podStartE2EDuration="12.533965618s" podCreationTimestamp="2026-02-27 06:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:05.531391856 +0000 UTC m=+223.994012415" watchObservedRunningTime="2026-02-27 06:14:05.533965618 +0000 UTC m=+223.996586187" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535309 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-utilities\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc 
kubenswrapper[4725]: I0227 06:14:05.535401 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-config\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535425 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-proxy-ca-bundles\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535453 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8drmb\" (UniqueName: \"kubernetes.io/projected/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-kube-api-access-8drmb\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535479 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-serving-cert\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535505 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/c008fcf9-f898-434a-b077-f8921e01be05-kube-api-access-9wwv9\") pod \"redhat-marketplace-2q5b8\" (UID: 
\"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535532 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-catalog-content\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.535559 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-client-ca\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.537941 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-client-ca\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.540365 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-config\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.542455 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-proxy-ca-bundles\") pod 
\"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.550067 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p26pd"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.567905 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drmb\" (UniqueName: \"kubernetes.io/projected/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-kube-api-access-8drmb\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.567996 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-serving-cert\") pod \"controller-manager-c86f7b94d-mqkrx\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.636335 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-utilities\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.636750 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/c008fcf9-f898-434a-b077-f8921e01be05-kube-api-access-9wwv9\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc 
kubenswrapper[4725]: I0227 06:14:05.636784 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-catalog-content\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.637342 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-catalog-content\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.637567 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-utilities\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.657851 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-b58nc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.657871 4725 patch_prober.go:28] interesting pod/downloads-7954f5f757-b58nc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.657909 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b58nc" 
podUID="6e178501-6e8d-4fcd-abf4-b13cd287501b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.657919 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b58nc" podUID="6e178501-6e8d-4fcd-abf4-b13cd287501b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.670898 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/c008fcf9-f898-434a-b077-f8921e01be05-kube-api-access-9wwv9\") pod \"redhat-marketplace-2q5b8\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.683771 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.706821 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lblv9"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.708401 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.713770 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lblv9"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.838533 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-catalog-content\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.838616 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-utilities\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.838642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l58ng\" (UniqueName: \"kubernetes.io/projected/2079d9d5-1660-4e80-a909-40d68fbe3c87-kube-api-access-l58ng\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.863269 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.940685 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-catalog-content\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.941365 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-catalog-content\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.944518 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q5b8"] Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.944669 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-utilities\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.944962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-utilities\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.945011 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l58ng\" (UniqueName: 
\"kubernetes.io/projected/2079d9d5-1660-4e80-a909-40d68fbe3c87-kube-api-access-l58ng\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.947894 4725 patch_prober.go:28] interesting pod/console-f9d7485db-qr9hb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.947955 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qr9hb" podUID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.948230 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.949571 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:14:05 crc kubenswrapper[4725]: I0227 06:14:05.971587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l58ng\" (UniqueName: \"kubernetes.io/projected/2079d9d5-1660-4e80-a909-40d68fbe3c87-kube-api-access-l58ng\") pod \"redhat-marketplace-lblv9\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.024395 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.025208 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.026162 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.030038 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.030225 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.046582 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.153923 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.154757 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.159791 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c86f7b94d-mqkrx"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.256477 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.256543 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.256588 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.268194 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.268932 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfdd960-ccbf-4554-b737-aa1c1f1e6572" path="/var/lib/kubelet/pods/cdfdd960-ccbf-4554-b737-aa1c1f1e6572/volumes" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.269473 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda518bc-5412-413a-8958-ea97c24a9795" path="/var/lib/kubelet/pods/eda518bc-5412-413a-8958-ea97c24a9795/volumes" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.280552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.306403 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.308200 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.328443 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.366621 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.372059 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.375500 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:06 crc kubenswrapper[4725]: [-]has-synced failed: reason withheld Feb 27 06:14:06 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:06 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.375783 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.456468 4725 generic.go:334] "Generic (PLEG): container finished" podID="5d1908a5-f826-4eae-a6fa-c899dda28b57" containerID="6f3f0781b6836ad165a8d34c5aacfb80cf7fb492aa0d8b82cf018e02b33cbde1" exitCode=0 Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.456582 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" event={"ID":"5d1908a5-f826-4eae-a6fa-c899dda28b57","Type":"ContainerDied","Data":"6f3f0781b6836ad165a8d34c5aacfb80cf7fb492aa0d8b82cf018e02b33cbde1"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.462705 4725 generic.go:334] "Generic (PLEG): container finished" podID="c6c43677-5742-41b3-9f77-2c22d965a7e6" containerID="1b00ef20601e25eb269770540a19e2f30615571396e145106eee84872061d58f" exitCode=0 Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.462770 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"c6c43677-5742-41b3-9f77-2c22d965a7e6","Type":"ContainerDied","Data":"1b00ef20601e25eb269770540a19e2f30615571396e145106eee84872061d58f"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.507327 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86fp7"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.509207 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.516296 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" event={"ID":"61f0c8ed-3e65-4e64-8806-1131c31ca4c3","Type":"ContainerStarted","Data":"cda051a38bbe960a8fb4c0e5a72992939190e9d1ac336baaab646697ce203efb"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.517347 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.524172 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" event={"ID":"a58d5af7-837b-45b1-a3cb-ffc3172f54e1","Type":"ContainerStarted","Data":"d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.526273 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" event={"ID":"a58d5af7-837b-45b1-a3cb-ffc3172f54e1","Type":"ContainerStarted","Data":"e0bd00c59d5bdb614f56ace5d9f940a23d33f77fed6aac059fb43edb4faf1961"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.526320 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.531949 4725 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-86fp7"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.547986 4725 generic.go:334] "Generic (PLEG): container finished" podID="c008fcf9-f898-434a-b077-f8921e01be05" containerID="a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215" exitCode=0 Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.548702 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q5b8" event={"ID":"c008fcf9-f898-434a-b077-f8921e01be05","Type":"ContainerDied","Data":"a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.548735 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q5b8" event={"ID":"c008fcf9-f898-434a-b077-f8921e01be05","Type":"ContainerStarted","Data":"1c40e479553f18fc6aa4daa59865cc197c61916a8d2284e94023f40aa6defb2e"} Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.554073 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cmp9v" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.572475 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" podStartSLOduration=161.572458691 podStartE2EDuration="2m41.572458691s" podCreationTimestamp="2026-02-27 06:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:06.56521404 +0000 UTC m=+225.027834619" watchObservedRunningTime="2026-02-27 06:14:06.572458691 +0000 UTC m=+225.035079260" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.670953 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mbl\" (UniqueName: 
\"kubernetes.io/projected/1295a124-164c-403c-8eb6-f71c3a9dc8a7-kube-api-access-s7mbl\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.670999 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-utilities\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.671184 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-catalog-content\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.702183 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 06:14:06 crc kubenswrapper[4725]: W0227 06:14:06.715436 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda4d9ed0e_8ac7_4835_bf4d_ecb030a78fd7.slice/crio-a74a7e4a111e78515451fe7df651638cdbfada29cd1dc03c3988f04bb992204c WatchSource:0}: Error finding container a74a7e4a111e78515451fe7df651638cdbfada29cd1dc03c3988f04bb992204c: Status 404 returned error can't find the container with id a74a7e4a111e78515451fe7df651638cdbfada29cd1dc03c3988f04bb992204c Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.731040 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.772742 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7mbl\" (UniqueName: \"kubernetes.io/projected/1295a124-164c-403c-8eb6-f71c3a9dc8a7-kube-api-access-s7mbl\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.772786 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-utilities\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.772847 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-catalog-content\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.773517 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-catalog-content\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.774096 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-utilities\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.799647 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lblv9"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.807227 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mbl\" (UniqueName: \"kubernetes.io/projected/1295a124-164c-403c-8eb6-f71c3a9dc8a7-kube-api-access-s7mbl\") pod \"redhat-operators-86fp7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: W0227 06:14:06.834757 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2079d9d5_1660_4e80_a909_40d68fbe3c87.slice/crio-8bebb9102acb5332391ae798dc3cd2e049ad5636f5d37641043fe8cbc21b856d WatchSource:0}: Error finding container 8bebb9102acb5332391ae798dc3cd2e049ad5636f5d37641043fe8cbc21b856d: Status 404 returned error can't find the container with id 8bebb9102acb5332391ae798dc3cd2e049ad5636f5d37641043fe8cbc21b856d Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.856149 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.905403 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7q5w4"] Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.906456 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:06 crc kubenswrapper[4725]: I0227 06:14:06.914766 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7q5w4"] Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.084071 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-catalog-content\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.084563 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-utilities\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.084595 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv9fh\" (UniqueName: \"kubernetes.io/projected/c54a518b-2ef3-4edc-9148-80dd4485fc90-kube-api-access-jv9fh\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.186444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-catalog-content\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.186520 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-utilities\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.186554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv9fh\" (UniqueName: \"kubernetes.io/projected/c54a518b-2ef3-4edc-9148-80dd4485fc90-kube-api-access-jv9fh\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.187657 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-catalog-content\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.187733 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-utilities\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.238133 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv9fh\" (UniqueName: \"kubernetes.io/projected/c54a518b-2ef3-4edc-9148-80dd4485fc90-kube-api-access-jv9fh\") pod \"redhat-operators-7q5w4\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.241643 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.366556 4725 patch_prober.go:28] interesting pod/router-default-5444994796-28snx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 06:14:07 crc kubenswrapper[4725]: [+]has-synced ok Feb 27 06:14:07 crc kubenswrapper[4725]: [+]process-running ok Feb 27 06:14:07 crc kubenswrapper[4725]: healthz check failed Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.366645 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28snx" podUID="40ad543f-d10f-4d38-bf97-fc1fd668e30f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.530655 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86fp7"] Feb 27 06:14:07 crc kubenswrapper[4725]: W0227 06:14:07.549622 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1295a124_164c_403c_8eb6_f71c3a9dc8a7.slice/crio-5ec958d1581966cf1613ed6f2bc01e41c40a90cf2eaa2430593de66b1eb27efc WatchSource:0}: Error finding container 5ec958d1581966cf1613ed6f2bc01e41c40a90cf2eaa2430593de66b1eb27efc: Status 404 returned error can't find the container with id 5ec958d1581966cf1613ed6f2bc01e41c40a90cf2eaa2430593de66b1eb27efc Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.612592 4725 generic.go:334] "Generic (PLEG): container finished" podID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerID="d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e" exitCode=0 Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.612716 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lblv9" event={"ID":"2079d9d5-1660-4e80-a909-40d68fbe3c87","Type":"ContainerDied","Data":"d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e"} Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.612749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lblv9" event={"ID":"2079d9d5-1660-4e80-a909-40d68fbe3c87","Type":"ContainerStarted","Data":"8bebb9102acb5332391ae798dc3cd2e049ad5636f5d37641043fe8cbc21b856d"} Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.626157 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7","Type":"ContainerStarted","Data":"a74a7e4a111e78515451fe7df651638cdbfada29cd1dc03c3988f04bb992204c"} Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.702222 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" event={"ID":"61f0c8ed-3e65-4e64-8806-1131c31ca4c3","Type":"ContainerStarted","Data":"7c2cb0d82e4141370233045c086b0f95edbaab0c6df5fabf0b6139f19153cd3c"} Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.703950 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.737023 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.756237 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" podStartSLOduration=6.75621087 podStartE2EDuration="6.75621087s" podCreationTimestamp="2026-02-27 06:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:07.755216292 +0000 UTC m=+226.217836881" watchObservedRunningTime="2026-02-27 06:14:07.75621087 +0000 UTC m=+226.218831439" Feb 27 06:14:07 crc kubenswrapper[4725]: I0227 06:14:07.964264 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7q5w4"] Feb 27 06:14:08 crc kubenswrapper[4725]: W0227 06:14:08.005683 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a518b_2ef3_4edc_9148_80dd4485fc90.slice/crio-555cbb8f753487898528303c8b1087910a511d4e05ac466869c87c8d69dffeb4 WatchSource:0}: Error finding container 555cbb8f753487898528303c8b1087910a511d4e05ac466869c87c8d69dffeb4: Status 404 returned error can't find the container with id 555cbb8f753487898528303c8b1087910a511d4e05ac466869c87c8d69dffeb4 Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.208992 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.318602 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d1908a5-f826-4eae-a6fa-c899dda28b57-config-volume\") pod \"5d1908a5-f826-4eae-a6fa-c899dda28b57\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.318728 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d1908a5-f826-4eae-a6fa-c899dda28b57-secret-volume\") pod \"5d1908a5-f826-4eae-a6fa-c899dda28b57\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.318772 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjk9w\" (UniqueName: \"kubernetes.io/projected/5d1908a5-f826-4eae-a6fa-c899dda28b57-kube-api-access-qjk9w\") pod \"5d1908a5-f826-4eae-a6fa-c899dda28b57\" (UID: \"5d1908a5-f826-4eae-a6fa-c899dda28b57\") " Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.319658 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1908a5-f826-4eae-a6fa-c899dda28b57-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d1908a5-f826-4eae-a6fa-c899dda28b57" (UID: "5d1908a5-f826-4eae-a6fa-c899dda28b57"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.326314 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.329349 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1908a5-f826-4eae-a6fa-c899dda28b57-kube-api-access-qjk9w" (OuterVolumeSpecName: "kube-api-access-qjk9w") pod "5d1908a5-f826-4eae-a6fa-c899dda28b57" (UID: "5d1908a5-f826-4eae-a6fa-c899dda28b57"). InnerVolumeSpecName "kube-api-access-qjk9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.330714 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1908a5-f826-4eae-a6fa-c899dda28b57-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d1908a5-f826-4eae-a6fa-c899dda28b57" (UID: "5d1908a5-f826-4eae-a6fa-c899dda28b57"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.387245 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.390165 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-28snx" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.420049 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c43677-5742-41b3-9f77-2c22d965a7e6-kube-api-access\") pod \"c6c43677-5742-41b3-9f77-2c22d965a7e6\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.420668 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c43677-5742-41b3-9f77-2c22d965a7e6-kubelet-dir\") pod 
\"c6c43677-5742-41b3-9f77-2c22d965a7e6\" (UID: \"c6c43677-5742-41b3-9f77-2c22d965a7e6\") " Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.421159 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d1908a5-f826-4eae-a6fa-c899dda28b57-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.421278 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d1908a5-f826-4eae-a6fa-c899dda28b57-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.421307 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjk9w\" (UniqueName: \"kubernetes.io/projected/5d1908a5-f826-4eae-a6fa-c899dda28b57-kube-api-access-qjk9w\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.421361 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6c43677-5742-41b3-9f77-2c22d965a7e6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c6c43677-5742-41b3-9f77-2c22d965a7e6" (UID: "c6c43677-5742-41b3-9f77-2c22d965a7e6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.423821 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c43677-5742-41b3-9f77-2c22d965a7e6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c6c43677-5742-41b3-9f77-2c22d965a7e6" (UID: "c6c43677-5742-41b3-9f77-2c22d965a7e6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.525540 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6c43677-5742-41b3-9f77-2c22d965a7e6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.525571 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6c43677-5742-41b3-9f77-2c22d965a7e6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.726792 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6c43677-5742-41b3-9f77-2c22d965a7e6","Type":"ContainerDied","Data":"a5d59bce631a170204a420cb24044172e349bc0af424191d4890a7b172096d04"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.726851 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d59bce631a170204a420cb24044172e349bc0af424191d4890a7b172096d04" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.726855 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.731055 4725 generic.go:334] "Generic (PLEG): container finished" podID="a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7" containerID="6187496e64e72ac166d90f1b0c507d0d23a77fec763f2cf68d60dc5da1c309e7" exitCode=0 Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.731241 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7","Type":"ContainerDied","Data":"6187496e64e72ac166d90f1b0c507d0d23a77fec763f2cf68d60dc5da1c309e7"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.735571 4725 generic.go:334] "Generic (PLEG): container finished" podID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerID="dc986bc006ee77e08888a5e21c7000e3eb3ea695456447930409a40d6341c895" exitCode=0 Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.735632 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerDied","Data":"dc986bc006ee77e08888a5e21c7000e3eb3ea695456447930409a40d6341c895"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.735651 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerStarted","Data":"555cbb8f753487898528303c8b1087910a511d4e05ac466869c87c8d69dffeb4"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.743402 4725 generic.go:334] "Generic (PLEG): container finished" podID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerID="416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5" exitCode=0 Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.743593 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86fp7" 
event={"ID":"1295a124-164c-403c-8eb6-f71c3a9dc8a7","Type":"ContainerDied","Data":"416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.744664 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86fp7" event={"ID":"1295a124-164c-403c-8eb6-f71c3a9dc8a7","Type":"ContainerStarted","Data":"5ec958d1581966cf1613ed6f2bc01e41c40a90cf2eaa2430593de66b1eb27efc"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.767486 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.770316 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp" event={"ID":"5d1908a5-f826-4eae-a6fa-c899dda28b57","Type":"ContainerDied","Data":"6a71e22eb325708ee42279eec4cbbad7ffc73096e7d1b79b104749db7b4168d7"} Feb 27 06:14:08 crc kubenswrapper[4725]: I0227 06:14:08.770363 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a71e22eb325708ee42279eec4cbbad7ffc73096e7d1b79b104749db7b4168d7" Feb 27 06:14:09 crc kubenswrapper[4725]: I0227 06:14:09.987832 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54930: no serving certificate available for the kubelet" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.197158 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.265715 4725 ???:1] "http: TLS handshake error from 192.168.126.11:54940: no serving certificate available for the kubelet" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.366696 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kube-api-access\") pod \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.367053 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kubelet-dir\") pod \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\" (UID: \"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7\") " Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.367275 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7" (UID: "a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.376656 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7" (UID: "a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.469939 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.469967 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.830631 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7","Type":"ContainerDied","Data":"a74a7e4a111e78515451fe7df651638cdbfada29cd1dc03c3988f04bb992204c"} Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.830674 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74a7e4a111e78515451fe7df651638cdbfada29cd1dc03c3988f04bb992204c" Feb 27 06:14:10 crc kubenswrapper[4725]: I0227 06:14:10.830741 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 06:14:11 crc kubenswrapper[4725]: I0227 06:14:11.589100 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pmdnc" Feb 27 06:14:15 crc kubenswrapper[4725]: I0227 06:14:15.663843 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-b58nc" Feb 27 06:14:16 crc kubenswrapper[4725]: I0227 06:14:16.059346 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:14:16 crc kubenswrapper[4725]: I0227 06:14:16.066064 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-qr9hb" Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.314133 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c86f7b94d-mqkrx"] Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.314553 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" podUID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" containerName="controller-manager" containerID="cri-o://7c2cb0d82e4141370233045c086b0f95edbaab0c6df5fabf0b6139f19153cd3c" gracePeriod=30 Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.341555 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"] Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.341757 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" podUID="7cf2fd61-154b-4d80-b6c5-add580271096" containerName="route-controller-manager" containerID="cri-o://a48027c83e457a5aae688b12c8a12ecd80e64b547b5f3f9bc9bea855fc7e0118" 
gracePeriod=30 Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.905684 4725 generic.go:334] "Generic (PLEG): container finished" podID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" containerID="7c2cb0d82e4141370233045c086b0f95edbaab0c6df5fabf0b6139f19153cd3c" exitCode=0 Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.905752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" event={"ID":"61f0c8ed-3e65-4e64-8806-1131c31ca4c3","Type":"ContainerDied","Data":"7c2cb0d82e4141370233045c086b0f95edbaab0c6df5fabf0b6139f19153cd3c"} Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.907897 4725 generic.go:334] "Generic (PLEG): container finished" podID="7cf2fd61-154b-4d80-b6c5-add580271096" containerID="a48027c83e457a5aae688b12c8a12ecd80e64b547b5f3f9bc9bea855fc7e0118" exitCode=0 Feb 27 06:14:19 crc kubenswrapper[4725]: I0227 06:14:19.907943 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" event={"ID":"7cf2fd61-154b-4d80-b6c5-add580271096","Type":"ContainerDied","Data":"a48027c83e457a5aae688b12c8a12ecd80e64b547b5f3f9bc9bea855fc7e0118"} Feb 27 06:14:20 crc kubenswrapper[4725]: I0227 06:14:20.267450 4725 ???:1] "http: TLS handshake error from 192.168.126.11:52284: no serving certificate available for the kubelet" Feb 27 06:14:23 crc kubenswrapper[4725]: I0227 06:14:23.790724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:14:23 crc kubenswrapper[4725]: I0227 06:14:23.793380 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 06:14:23 crc kubenswrapper[4725]: 
I0227 06:14:23.808863 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea1b7fec-c4c1-4ae5-a74a-8396d6428900-metrics-certs\") pod \"network-metrics-daemon-vcl2g\" (UID: \"ea1b7fec-c4c1-4ae5-a74a-8396d6428900\") " pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:14:23 crc kubenswrapper[4725]: I0227 06:14:23.913118 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 06:14:23 crc kubenswrapper[4725]: I0227 06:14:23.921662 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vcl2g" Feb 27 06:14:24 crc kubenswrapper[4725]: I0227 06:14:24.550106 4725 patch_prober.go:28] interesting pod/route-controller-manager-6985668574-8g7qr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 06:14:24 crc kubenswrapper[4725]: I0227 06:14:24.550189 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" podUID="7cf2fd61-154b-4d80-b6c5-add580271096" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 06:14:25 crc kubenswrapper[4725]: I0227 06:14:25.010042 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.326813 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.332390 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.351220 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r"] Feb 27 06:14:26 crc kubenswrapper[4725]: E0227 06:14:26.351680 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c43677-5742-41b3-9f77-2c22d965a7e6" containerName="pruner" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.351779 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c43677-5742-41b3-9f77-2c22d965a7e6" containerName="pruner" Feb 27 06:14:26 crc kubenswrapper[4725]: E0227 06:14:26.351863 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1908a5-f826-4eae-a6fa-c899dda28b57" containerName="collect-profiles" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.351923 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1908a5-f826-4eae-a6fa-c899dda28b57" containerName="collect-profiles" Feb 27 06:14:26 crc kubenswrapper[4725]: E0227 06:14:26.351981 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7" containerName="pruner" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352038 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7" containerName="pruner" Feb 27 06:14:26 crc kubenswrapper[4725]: E0227 06:14:26.352098 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf2fd61-154b-4d80-b6c5-add580271096" containerName="route-controller-manager" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352149 4725 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7cf2fd61-154b-4d80-b6c5-add580271096" containerName="route-controller-manager" Feb 27 06:14:26 crc kubenswrapper[4725]: E0227 06:14:26.352206 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" containerName="controller-manager" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352268 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" containerName="controller-manager" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352436 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf2fd61-154b-4d80-b6c5-add580271096" containerName="route-controller-manager" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352496 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d9ed0e-8ac7-4835-bf4d-ecb030a78fd7" containerName="pruner" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352571 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1908a5-f826-4eae-a6fa-c899dda28b57" containerName="collect-profiles" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352799 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c43677-5742-41b3-9f77-2c22d965a7e6" containerName="pruner" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.352856 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" containerName="controller-manager" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.353413 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.360170 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r"] Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.450147 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf2fd61-154b-4d80-b6c5-add580271096-serving-cert\") pod \"7cf2fd61-154b-4d80-b6c5-add580271096\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.450196 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js26w\" (UniqueName: \"kubernetes.io/projected/7cf2fd61-154b-4d80-b6c5-add580271096-kube-api-access-js26w\") pod \"7cf2fd61-154b-4d80-b6c5-add580271096\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.450313 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-config\") pod \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.450932 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8drmb\" (UniqueName: \"kubernetes.io/projected/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-kube-api-access-8drmb\") pod \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.450997 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-client-ca\") pod 
\"7cf2fd61-154b-4d80-b6c5-add580271096\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451059 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-config\") pod \"7cf2fd61-154b-4d80-b6c5-add580271096\" (UID: \"7cf2fd61-154b-4d80-b6c5-add580271096\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451102 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-proxy-ca-bundles\") pod \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451120 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-serving-cert\") pod \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451147 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-client-ca\") pod \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\" (UID: \"61f0c8ed-3e65-4e64-8806-1131c31ca4c3\") " Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451305 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-serving-cert\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451351 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcrv\" (UniqueName: \"kubernetes.io/projected/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-kube-api-access-fgcrv\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451381 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-client-ca\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451403 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-config\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451487 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-config" (OuterVolumeSpecName: "config") pod "61f0c8ed-3e65-4e64-8806-1131c31ca4c3" (UID: "61f0c8ed-3e65-4e64-8806-1131c31ca4c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451597 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-client-ca" (OuterVolumeSpecName: "client-ca") pod "7cf2fd61-154b-4d80-b6c5-add580271096" (UID: "7cf2fd61-154b-4d80-b6c5-add580271096"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451850 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "61f0c8ed-3e65-4e64-8806-1131c31ca4c3" (UID: "61f0c8ed-3e65-4e64-8806-1131c31ca4c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.451906 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-config" (OuterVolumeSpecName: "config") pod "7cf2fd61-154b-4d80-b6c5-add580271096" (UID: "7cf2fd61-154b-4d80-b6c5-add580271096"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.452115 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "61f0c8ed-3e65-4e64-8806-1131c31ca4c3" (UID: "61f0c8ed-3e65-4e64-8806-1131c31ca4c3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.456566 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-kube-api-access-8drmb" (OuterVolumeSpecName: "kube-api-access-8drmb") pod "61f0c8ed-3e65-4e64-8806-1131c31ca4c3" (UID: "61f0c8ed-3e65-4e64-8806-1131c31ca4c3"). InnerVolumeSpecName "kube-api-access-8drmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.459315 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf2fd61-154b-4d80-b6c5-add580271096-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7cf2fd61-154b-4d80-b6c5-add580271096" (UID: "7cf2fd61-154b-4d80-b6c5-add580271096"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.459624 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "61f0c8ed-3e65-4e64-8806-1131c31ca4c3" (UID: "61f0c8ed-3e65-4e64-8806-1131c31ca4c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.462978 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf2fd61-154b-4d80-b6c5-add580271096-kube-api-access-js26w" (OuterVolumeSpecName: "kube-api-access-js26w") pod "7cf2fd61-154b-4d80-b6c5-add580271096" (UID: "7cf2fd61-154b-4d80-b6c5-add580271096"). InnerVolumeSpecName "kube-api-access-js26w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552565 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-serving-cert\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552627 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcrv\" (UniqueName: \"kubernetes.io/projected/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-kube-api-access-fgcrv\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552661 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-client-ca\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552688 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-config\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552760 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552771 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf2fd61-154b-4d80-b6c5-add580271096-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552782 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552791 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552800 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552808 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf2fd61-154b-4d80-b6c5-add580271096-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552816 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js26w\" (UniqueName: \"kubernetes.io/projected/7cf2fd61-154b-4d80-b6c5-add580271096-kube-api-access-js26w\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.552824 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 
06:14:26.552834 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8drmb\" (UniqueName: \"kubernetes.io/projected/61f0c8ed-3e65-4e64-8806-1131c31ca4c3-kube-api-access-8drmb\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.553902 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-client-ca\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.554003 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-config\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.561383 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-serving-cert\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.571860 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcrv\" (UniqueName: \"kubernetes.io/projected/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-kube-api-access-fgcrv\") pod \"route-controller-manager-5cbf66fc5f-kwx8r\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 
06:14:26.674747 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.865645 4725 patch_prober.go:28] interesting pod/controller-manager-c86f7b94d-mqkrx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" start-of-body= Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.865702 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" podUID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.955108 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" event={"ID":"7cf2fd61-154b-4d80-b6c5-add580271096","Type":"ContainerDied","Data":"62fb44b7d515ef6918696e3acbb7772963a26ac39499e1409120d2996442928d"} Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.955176 4725 scope.go:117] "RemoveContainer" containerID="a48027c83e457a5aae688b12c8a12ecd80e64b547b5f3f9bc9bea855fc7e0118" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.955241 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.957641 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" event={"ID":"61f0c8ed-3e65-4e64-8806-1131c31ca4c3","Type":"ContainerDied","Data":"cda051a38bbe960a8fb4c0e5a72992939190e9d1ac336baaab646697ce203efb"} Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.957754 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c86f7b94d-mqkrx" Feb 27 06:14:26 crc kubenswrapper[4725]: I0227 06:14:26.989243 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"] Feb 27 06:14:27 crc kubenswrapper[4725]: I0227 06:14:27.011509 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985668574-8g7qr"] Feb 27 06:14:27 crc kubenswrapper[4725]: I0227 06:14:27.022884 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c86f7b94d-mqkrx"] Feb 27 06:14:27 crc kubenswrapper[4725]: I0227 06:14:27.028809 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c86f7b94d-mqkrx"] Feb 27 06:14:28 crc kubenswrapper[4725]: I0227 06:14:28.258984 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f0c8ed-3e65-4e64-8806-1131c31ca4c3" path="/var/lib/kubelet/pods/61f0c8ed-3e65-4e64-8806-1131c31ca4c3/volumes" Feb 27 06:14:28 crc kubenswrapper[4725]: I0227 06:14:28.260505 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf2fd61-154b-4d80-b6c5-add580271096" path="/var/lib/kubelet/pods/7cf2fd61-154b-4d80-b6c5-add580271096/volumes" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 
06:14:29.240426 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf"] Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.243458 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.248129 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.249866 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf"] Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.252307 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.252473 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.252603 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.252747 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.258728 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.269686 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.393348 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-proxy-ca-bundles\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.393495 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-client-ca\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.393538 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-config\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.393700 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqhfg\" (UniqueName: \"kubernetes.io/projected/1a8e322f-7135-4e94-93ea-b53a6457f11a-kube-api-access-bqhfg\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.394002 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8e322f-7135-4e94-93ea-b53a6457f11a-serving-cert\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " 
pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.495050 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-config\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.495089 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqhfg\" (UniqueName: \"kubernetes.io/projected/1a8e322f-7135-4e94-93ea-b53a6457f11a-kube-api-access-bqhfg\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.495143 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8e322f-7135-4e94-93ea-b53a6457f11a-serving-cert\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.495169 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-proxy-ca-bundles\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.495207 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-client-ca\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.497742 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-config\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.497892 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-client-ca\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.506760 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-proxy-ca-bundles\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.507319 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8e322f-7135-4e94-93ea-b53a6457f11a-serving-cert\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.511355 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bqhfg\" (UniqueName: \"kubernetes.io/projected/1a8e322f-7135-4e94-93ea-b53a6457f11a-kube-api-access-bqhfg\") pod \"controller-manager-6bfd6bc94f-ldbpf\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:29 crc kubenswrapper[4725]: I0227 06:14:29.665601 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.276447 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.276967 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:14:30 crc kubenswrapper[4725]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 06:14:30 crc kubenswrapper[4725]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s2v98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
auto-csr-approver-29536212-627nh_openshift-infra(234512e0-3471-4bd8-b783-6df7b63f2cfe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 27 06:14:30 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.278510 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29536212-627nh" podUID="234512e0-3471-4bd8-b783-6df7b63f2cfe" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.310209 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.310754 4725 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 06:14:30 crc kubenswrapper[4725]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 06:14:30 crc kubenswrapper[4725]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8nnb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536214-p7k5h_openshift-infra(06c0abcb-fb62-4f62-b73e-a27620de9add): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 27 06:14:30 crc kubenswrapper[4725]: > logger="UnhandledError" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.311937 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" podUID="06c0abcb-fb62-4f62-b73e-a27620de9add" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.983057 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536212-627nh" podUID="234512e0-3471-4bd8-b783-6df7b63f2cfe" Feb 27 06:14:30 crc kubenswrapper[4725]: E0227 06:14:30.983107 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" podUID="06c0abcb-fb62-4f62-b73e-a27620de9add" Feb 27 06:14:32 crc kubenswrapper[4725]: I0227 06:14:32.553949 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:14:32 crc kubenswrapper[4725]: I0227 06:14:32.554218 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:14:33 crc kubenswrapper[4725]: E0227 06:14:33.895827 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 06:14:33 crc kubenswrapper[4725]: E0227 06:14:33.896271 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jv9fh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7q5w4_openshift-marketplace(c54a518b-2ef3-4edc-9148-80dd4485fc90): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:33 crc kubenswrapper[4725]: E0227 06:14:33.897450 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7q5w4" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" Feb 27 06:14:35 crc 
kubenswrapper[4725]: E0227 06:14:35.070869 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7q5w4" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" Feb 27 06:14:35 crc kubenswrapper[4725]: E0227 06:14:35.145930 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 06:14:35 crc kubenswrapper[4725]: E0227 06:14:35.146537 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l58ng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lblv9_openshift-marketplace(2079d9d5-1660-4e80-a909-40d68fbe3c87): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:35 crc kubenswrapper[4725]: E0227 06:14:35.148069 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lblv9" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" Feb 27 06:14:36 crc 
kubenswrapper[4725]: E0227 06:14:36.777676 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lblv9" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" Feb 27 06:14:36 crc kubenswrapper[4725]: I0227 06:14:36.803157 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-497vv" Feb 27 06:14:36 crc kubenswrapper[4725]: E0227 06:14:36.857074 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 06:14:36 crc kubenswrapper[4725]: E0227 06:14:36.857409 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zv7j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-htnhk_openshift-marketplace(4e0ac478-aa78-481d-84d3-f4a5c6bedadb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:36 crc kubenswrapper[4725]: E0227 06:14:36.858950 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-htnhk" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" Feb 27 06:14:37 crc 
kubenswrapper[4725]: I0227 06:14:37.282514 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 06:14:38 crc kubenswrapper[4725]: E0227 06:14:38.926167 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-htnhk" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" Feb 27 06:14:38 crc kubenswrapper[4725]: I0227 06:14:38.933578 4725 scope.go:117] "RemoveContainer" containerID="7c2cb0d82e4141370233045c086b0f95edbaab0c6df5fabf0b6139f19153cd3c" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.000938 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.001156 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kz9d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-b6cpp_openshift-marketplace(c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.002365 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-b6cpp" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" Feb 27 06:14:39 crc 
kubenswrapper[4725]: E0227 06:14:39.026162 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.026394 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2bg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-j6hj6_openshift-marketplace(3bfc8e5f-5a0f-4384-a2af-0817928d8ba5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.028368 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-j6hj6" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.040146 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.040316 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wwv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2q5b8_openshift-marketplace(c008fcf9-f898-434a-b077-f8921e01be05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.041533 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2q5b8" podUID="c008fcf9-f898-434a-b077-f8921e01be05" Feb 27 06:14:39 crc 
kubenswrapper[4725]: E0227 06:14:39.041597 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.041698 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j55ss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-8rj5q_openshift-marketplace(3965afd0-6cf4-4ea2-86a1-ce69bb98f260): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.043044 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8rj5q" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.055737 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-j6hj6" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" Feb 27 06:14:39 crc kubenswrapper[4725]: E0227 06:14:39.055822 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-b6cpp" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.199588 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.201166 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.203226 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.203349 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.211164 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.302696 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf"] Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.341664 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0afde6d5-bf16-4603-a231-6b305648d72f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.341735 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0afde6d5-bf16-4603-a231-6b305648d72f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.367251 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vcl2g"] Feb 27 06:14:39 crc kubenswrapper[4725]: W0227 06:14:39.372034 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1b7fec_c4c1_4ae5_a74a_8396d6428900.slice/crio-68cfcf5e59e3a87f85ea6df96aa14ec99844b18ee9cfb99af46a1cd6c6bd97cb WatchSource:0}: Error finding container 68cfcf5e59e3a87f85ea6df96aa14ec99844b18ee9cfb99af46a1cd6c6bd97cb: Status 404 returned error can't find the container with id 68cfcf5e59e3a87f85ea6df96aa14ec99844b18ee9cfb99af46a1cd6c6bd97cb Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.402817 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r"] Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.432632 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r"] Feb 27 06:14:39 crc kubenswrapper[4725]: W0227 06:14:39.438980 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb37bfec_0e55_46f9_bd0c_91cd4be675c9.slice/crio-61d3d0573e6d138df358397834479a89be7de9d4cdcb582acd0906070fb43f8c WatchSource:0}: Error finding container 61d3d0573e6d138df358397834479a89be7de9d4cdcb582acd0906070fb43f8c: Status 404 returned error can't find the container with id 61d3d0573e6d138df358397834479a89be7de9d4cdcb582acd0906070fb43f8c Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.442560 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0afde6d5-bf16-4603-a231-6b305648d72f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.442633 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0afde6d5-bf16-4603-a231-6b305648d72f-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.442837 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0afde6d5-bf16-4603-a231-6b305648d72f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.450396 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf"] Feb 27 06:14:39 crc kubenswrapper[4725]: W0227 06:14:39.460497 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8e322f_7135_4e94_93ea_b53a6457f11a.slice/crio-44e2096e74673af4618b7c3dbc82ad955b238e1c03fa21e9df967d4969c93cc4 WatchSource:0}: Error finding container 44e2096e74673af4618b7c3dbc82ad955b238e1c03fa21e9df967d4969c93cc4: Status 404 returned error can't find the container with id 44e2096e74673af4618b7c3dbc82ad955b238e1c03fa21e9df967d4969c93cc4 Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.467251 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0afde6d5-bf16-4603-a231-6b305648d72f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:39 crc kubenswrapper[4725]: I0227 06:14:39.532317 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.002257 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 06:14:40 crc kubenswrapper[4725]: W0227 06:14:40.040225 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0afde6d5_bf16_4603_a231_6b305648d72f.slice/crio-5a2607a277194bd41b062ec3286b45c09d37d5fba0e60fef919d84ad44193b15 WatchSource:0}: Error finding container 5a2607a277194bd41b062ec3286b45c09d37d5fba0e60fef919d84ad44193b15: Status 404 returned error can't find the container with id 5a2607a277194bd41b062ec3286b45c09d37d5fba0e60fef919d84ad44193b15 Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.061389 4725 generic.go:334] "Generic (PLEG): container finished" podID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerID="6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9" exitCode=0 Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.061486 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86fp7" event={"ID":"1295a124-164c-403c-8eb6-f71c3a9dc8a7","Type":"ContainerDied","Data":"6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.063726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" event={"ID":"cb37bfec-0e55-46f9-bd0c-91cd4be675c9","Type":"ContainerStarted","Data":"32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.063771 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" 
event={"ID":"cb37bfec-0e55-46f9-bd0c-91cd4be675c9","Type":"ContainerStarted","Data":"61d3d0573e6d138df358397834479a89be7de9d4cdcb582acd0906070fb43f8c"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.063792 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" podUID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" containerName="route-controller-manager" containerID="cri-o://32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed" gracePeriod=30 Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.063902 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.071011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0afde6d5-bf16-4603-a231-6b305648d72f","Type":"ContainerStarted","Data":"5a2607a277194bd41b062ec3286b45c09d37d5fba0e60fef919d84ad44193b15"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.075970 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" podUID="1a8e322f-7135-4e94-93ea-b53a6457f11a" containerName="controller-manager" containerID="cri-o://466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31" gracePeriod=30 Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.076065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" event={"ID":"1a8e322f-7135-4e94-93ea-b53a6457f11a","Type":"ContainerStarted","Data":"466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.076093 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" event={"ID":"1a8e322f-7135-4e94-93ea-b53a6457f11a","Type":"ContainerStarted","Data":"44e2096e74673af4618b7c3dbc82ad955b238e1c03fa21e9df967d4969c93cc4"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.076424 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.080809 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" event={"ID":"ea1b7fec-c4c1-4ae5-a74a-8396d6428900","Type":"ContainerStarted","Data":"77d4238beb99788e022c24b3d9bdc970f20398705133b5b8920eb83ca18213c7"} Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.081139 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" event={"ID":"ea1b7fec-c4c1-4ae5-a74a-8396d6428900","Type":"ContainerStarted","Data":"68cfcf5e59e3a87f85ea6df96aa14ec99844b18ee9cfb99af46a1cd6c6bd97cb"} Feb 27 06:14:40 crc kubenswrapper[4725]: E0227 06:14:40.083830 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2q5b8" podUID="c008fcf9-f898-434a-b077-f8921e01be05" Feb 27 06:14:40 crc kubenswrapper[4725]: E0227 06:14:40.083900 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8rj5q" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.086268 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.223127 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" podStartSLOduration=21.223111432 podStartE2EDuration="21.223111432s" podCreationTimestamp="2026-02-27 06:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:40.220080498 +0000 UTC m=+258.682701077" watchObservedRunningTime="2026-02-27 06:14:40.223111432 +0000 UTC m=+258.685732001" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.224741 4725 patch_prober.go:28] interesting pod/route-controller-manager-5cbf66fc5f-kwx8r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:40554->10.217.0.58:8443: read: connection reset by peer" start-of-body= Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.224800 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" podUID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:40554->10.217.0.58:8443: read: connection reset by peer" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.525899 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5cbf66fc5f-kwx8r_cb37bfec-0e55-46f9-bd0c-91cd4be675c9/route-controller-manager/0.log" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.526008 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.552329 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j"] Feb 27 06:14:40 crc kubenswrapper[4725]: E0227 06:14:40.552674 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" containerName="route-controller-manager" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.552691 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" containerName="route-controller-manager" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.552801 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" containerName="route-controller-manager" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.553320 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.567684 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j"] Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.676911 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-config\") pod \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677178 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-serving-cert\") pod \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677324 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-client-ca\") pod \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677450 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgcrv\" (UniqueName: \"kubernetes.io/projected/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-kube-api-access-fgcrv\") pod \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\" (UID: \"cb37bfec-0e55-46f9-bd0c-91cd4be675c9\") " Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677693 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxkjn\" (UniqueName: \"kubernetes.io/projected/4afb07af-069f-4095-a8ca-10dee0be3a48-kube-api-access-lxkjn\") pod 
\"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677826 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4afb07af-069f-4095-a8ca-10dee0be3a48-serving-cert\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677919 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-config\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.678008 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-client-ca\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.677955 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb37bfec-0e55-46f9-bd0c-91cd4be675c9" (UID: "cb37bfec-0e55-46f9-bd0c-91cd4be675c9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.678010 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-config" (OuterVolumeSpecName: "config") pod "cb37bfec-0e55-46f9-bd0c-91cd4be675c9" (UID: "cb37bfec-0e55-46f9-bd0c-91cd4be675c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.682179 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-kube-api-access-fgcrv" (OuterVolumeSpecName: "kube-api-access-fgcrv") pod "cb37bfec-0e55-46f9-bd0c-91cd4be675c9" (UID: "cb37bfec-0e55-46f9-bd0c-91cd4be675c9"). InnerVolumeSpecName "kube-api-access-fgcrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.682695 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb37bfec-0e55-46f9-bd0c-91cd4be675c9" (UID: "cb37bfec-0e55-46f9-bd0c-91cd4be675c9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.779957 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxkjn\" (UniqueName: \"kubernetes.io/projected/4afb07af-069f-4095-a8ca-10dee0be3a48-kube-api-access-lxkjn\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780013 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4afb07af-069f-4095-a8ca-10dee0be3a48-serving-cert\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780044 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-config\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780070 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-client-ca\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780126 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780137 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgcrv\" (UniqueName: \"kubernetes.io/projected/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-kube-api-access-fgcrv\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780192 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.780201 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb37bfec-0e55-46f9-bd0c-91cd4be675c9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.781022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-client-ca\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.781433 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-config\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.787899 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4afb07af-069f-4095-a8ca-10dee0be3a48-serving-cert\") pod 
\"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.794987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxkjn\" (UniqueName: \"kubernetes.io/projected/4afb07af-069f-4095-a8ca-10dee0be3a48-kube-api-access-lxkjn\") pod \"route-controller-manager-79668c5b4-vf95j\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:40 crc kubenswrapper[4725]: I0227 06:14:40.872589 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.076734 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.091630 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5cbf66fc5f-kwx8r_cb37bfec-0e55-46f9-bd0c-91cd4be675c9/route-controller-manager/0.log" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.091705 4725 generic.go:334] "Generic (PLEG): container finished" podID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" containerID="32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed" exitCode=255 Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.091802 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" event={"ID":"cb37bfec-0e55-46f9-bd0c-91cd4be675c9","Type":"ContainerDied","Data":"32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed"} Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 
06:14:41.091837 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" event={"ID":"cb37bfec-0e55-46f9-bd0c-91cd4be675c9","Type":"ContainerDied","Data":"61d3d0573e6d138df358397834479a89be7de9d4cdcb582acd0906070fb43f8c"} Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.091805 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.091857 4725 scope.go:117] "RemoveContainer" containerID="32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.100153 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0afde6d5-bf16-4603-a231-6b305648d72f","Type":"ContainerStarted","Data":"c3843917d8b40523b8762ddc2af0efa370f940d97c051bb03b8203da0160b5b4"} Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.108155 4725 generic.go:334] "Generic (PLEG): container finished" podID="1a8e322f-7135-4e94-93ea-b53a6457f11a" containerID="466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31" exitCode=0 Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.108253 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" event={"ID":"1a8e322f-7135-4e94-93ea-b53a6457f11a","Type":"ContainerDied","Data":"466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31"} Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.108355 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" event={"ID":"1a8e322f-7135-4e94-93ea-b53a6457f11a","Type":"ContainerDied","Data":"44e2096e74673af4618b7c3dbc82ad955b238e1c03fa21e9df967d4969c93cc4"} Feb 27 06:14:41 crc 
kubenswrapper[4725]: I0227 06:14:41.111663 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.119742 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j"] Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.119917 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vcl2g" event={"ID":"ea1b7fec-c4c1-4ae5-a74a-8396d6428900","Type":"ContainerStarted","Data":"a366a3da3aee05bd3dffe2f3b0b533af976d5341368a2bb7c1f17883259ec2a4"} Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.121015 4725 scope.go:117] "RemoveContainer" containerID="32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed" Feb 27 06:14:41 crc kubenswrapper[4725]: E0227 06:14:41.121409 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed\": container with ID starting with 32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed not found: ID does not exist" containerID="32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.121447 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed"} err="failed to get container status \"32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed\": rpc error: code = NotFound desc = could not find container \"32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed\": container with ID starting with 32bf19beffcf899d03a3751e494cf75ad9ee670a6d3610532868af78df6b79ed not found: ID does not exist" Feb 27 06:14:41 crc kubenswrapper[4725]: 
I0227 06:14:41.121474 4725 scope.go:117] "RemoveContainer" containerID="466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.122958 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.122945454 podStartE2EDuration="2.122945454s" podCreationTimestamp="2026-02-27 06:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:41.119821468 +0000 UTC m=+259.582442037" watchObservedRunningTime="2026-02-27 06:14:41.122945454 +0000 UTC m=+259.585566013" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.135666 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r"] Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.139075 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cbf66fc5f-kwx8r"] Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.152614 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vcl2g" podStartSLOduration=197.152497443 podStartE2EDuration="3m17.152497443s" podCreationTimestamp="2026-02-27 06:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:41.147853465 +0000 UTC m=+259.610474044" watchObservedRunningTime="2026-02-27 06:14:41.152497443 +0000 UTC m=+259.615118012" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.170191 4725 scope.go:117] "RemoveContainer" containerID="466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31" Feb 27 06:14:41 crc kubenswrapper[4725]: E0227 06:14:41.170676 4725 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31\": container with ID starting with 466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31 not found: ID does not exist" containerID="466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.170738 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31"} err="failed to get container status \"466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31\": rpc error: code = NotFound desc = could not find container \"466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31\": container with ID starting with 466aa4579af367ed6dc7a668fb8240dfc31af2d6d62387913cde50910e495d31 not found: ID does not exist" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.187672 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-proxy-ca-bundles\") pod \"1a8e322f-7135-4e94-93ea-b53a6457f11a\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.187748 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqhfg\" (UniqueName: \"kubernetes.io/projected/1a8e322f-7135-4e94-93ea-b53a6457f11a-kube-api-access-bqhfg\") pod \"1a8e322f-7135-4e94-93ea-b53a6457f11a\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.187825 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-client-ca\") pod \"1a8e322f-7135-4e94-93ea-b53a6457f11a\" (UID: 
\"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.187876 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-config\") pod \"1a8e322f-7135-4e94-93ea-b53a6457f11a\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.187899 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8e322f-7135-4e94-93ea-b53a6457f11a-serving-cert\") pod \"1a8e322f-7135-4e94-93ea-b53a6457f11a\" (UID: \"1a8e322f-7135-4e94-93ea-b53a6457f11a\") " Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.188548 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1a8e322f-7135-4e94-93ea-b53a6457f11a" (UID: "1a8e322f-7135-4e94-93ea-b53a6457f11a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.188666 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-client-ca" (OuterVolumeSpecName: "client-ca") pod "1a8e322f-7135-4e94-93ea-b53a6457f11a" (UID: "1a8e322f-7135-4e94-93ea-b53a6457f11a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.189332 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-config" (OuterVolumeSpecName: "config") pod "1a8e322f-7135-4e94-93ea-b53a6457f11a" (UID: "1a8e322f-7135-4e94-93ea-b53a6457f11a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.194332 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8e322f-7135-4e94-93ea-b53a6457f11a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1a8e322f-7135-4e94-93ea-b53a6457f11a" (UID: "1a8e322f-7135-4e94-93ea-b53a6457f11a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.196715 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8e322f-7135-4e94-93ea-b53a6457f11a-kube-api-access-bqhfg" (OuterVolumeSpecName: "kube-api-access-bqhfg") pod "1a8e322f-7135-4e94-93ea-b53a6457f11a" (UID: "1a8e322f-7135-4e94-93ea-b53a6457f11a"). InnerVolumeSpecName "kube-api-access-bqhfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.290104 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.290135 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.290144 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a8e322f-7135-4e94-93ea-b53a6457f11a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.290153 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1a8e322f-7135-4e94-93ea-b53a6457f11a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 
06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.290164 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqhfg\" (UniqueName: \"kubernetes.io/projected/1a8e322f-7135-4e94-93ea-b53a6457f11a-kube-api-access-bqhfg\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.495715 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf"] Feb 27 06:14:41 crc kubenswrapper[4725]: I0227 06:14:41.499046 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bfd6bc94f-ldbpf"] Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.131096 4725 generic.go:334] "Generic (PLEG): container finished" podID="0afde6d5-bf16-4603-a231-6b305648d72f" containerID="c3843917d8b40523b8762ddc2af0efa370f940d97c051bb03b8203da0160b5b4" exitCode=0 Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.131154 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0afde6d5-bf16-4603-a231-6b305648d72f","Type":"ContainerDied","Data":"c3843917d8b40523b8762ddc2af0efa370f940d97c051bb03b8203da0160b5b4"} Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.140522 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86fp7" event={"ID":"1295a124-164c-403c-8eb6-f71c3a9dc8a7","Type":"ContainerStarted","Data":"8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639"} Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.142245 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" event={"ID":"4afb07af-069f-4095-a8ca-10dee0be3a48","Type":"ContainerStarted","Data":"60dfa8131fa3acd400ef584cc8ef241f0f4b28c64e0a996d8c8d4842210a03dc"} Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.142324 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" event={"ID":"4afb07af-069f-4095-a8ca-10dee0be3a48","Type":"ContainerStarted","Data":"812a43b444104c638b66de1a734d606e0b59186d68b7016ce131a1f905631cc1"} Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.175005 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86fp7" podStartSLOduration=3.118828564 podStartE2EDuration="36.174983003s" podCreationTimestamp="2026-02-27 06:14:06 +0000 UTC" firstStartedPulling="2026-02-27 06:14:08.746753875 +0000 UTC m=+227.209374444" lastFinishedPulling="2026-02-27 06:14:41.802908314 +0000 UTC m=+260.265528883" observedRunningTime="2026-02-27 06:14:42.169107603 +0000 UTC m=+260.631728182" watchObservedRunningTime="2026-02-27 06:14:42.174983003 +0000 UTC m=+260.637603572" Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.189817 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" podStartSLOduration=3.1897965 podStartE2EDuration="3.1897965s" podCreationTimestamp="2026-02-27 06:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:42.18668657 +0000 UTC m=+260.649307149" watchObservedRunningTime="2026-02-27 06:14:42.1897965 +0000 UTC m=+260.652417069" Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.267777 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8e322f-7135-4e94-93ea-b53a6457f11a" path="/var/lib/kubelet/pods/1a8e322f-7135-4e94-93ea-b53a6457f11a/volumes" Feb 27 06:14:42 crc kubenswrapper[4725]: I0227 06:14:42.268534 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb37bfec-0e55-46f9-bd0c-91cd4be675c9" path="/var/lib/kubelet/pods/cb37bfec-0e55-46f9-bd0c-91cd4be675c9/volumes" Feb 27 06:14:43 
crc kubenswrapper[4725]: I0227 06:14:43.154089 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.159578 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.244881 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-657744c59d-zszcc"] Feb 27 06:14:43 crc kubenswrapper[4725]: E0227 06:14:43.245088 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8e322f-7135-4e94-93ea-b53a6457f11a" containerName="controller-manager" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.245100 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8e322f-7135-4e94-93ea-b53a6457f11a" containerName="controller-manager" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.245199 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8e322f-7135-4e94-93ea-b53a6457f11a" containerName="controller-manager" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.245539 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.250172 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.250726 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.250974 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.251050 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.251172 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.251237 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.267114 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.270820 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-657744c59d-zszcc"] Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.322427 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-proxy-ca-bundles\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " 
pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.323067 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vssh\" (UniqueName: \"kubernetes.io/projected/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-kube-api-access-4vssh\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.323142 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-config\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.323166 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-client-ca\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.323205 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-serving-cert\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.420443 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.424666 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vssh\" (UniqueName: \"kubernetes.io/projected/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-kube-api-access-4vssh\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.424736 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-config\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.424761 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-client-ca\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.424787 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-serving-cert\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.424872 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-proxy-ca-bundles\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.426123 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-proxy-ca-bundles\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.426214 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-client-ca\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.426485 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-config\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.433858 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-serving-cert\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.450924 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4vssh\" (UniqueName: \"kubernetes.io/projected/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-kube-api-access-4vssh\") pod \"controller-manager-657744c59d-zszcc\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.527595 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0afde6d5-bf16-4603-a231-6b305648d72f-kube-api-access\") pod \"0afde6d5-bf16-4603-a231-6b305648d72f\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.527668 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0afde6d5-bf16-4603-a231-6b305648d72f-kubelet-dir\") pod \"0afde6d5-bf16-4603-a231-6b305648d72f\" (UID: \"0afde6d5-bf16-4603-a231-6b305648d72f\") " Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.528169 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afde6d5-bf16-4603-a231-6b305648d72f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0afde6d5-bf16-4603-a231-6b305648d72f" (UID: "0afde6d5-bf16-4603-a231-6b305648d72f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.531918 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afde6d5-bf16-4603-a231-6b305648d72f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0afde6d5-bf16-4603-a231-6b305648d72f" (UID: "0afde6d5-bf16-4603-a231-6b305648d72f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.588812 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.629267 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0afde6d5-bf16-4603-a231-6b305648d72f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.629326 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0afde6d5-bf16-4603-a231-6b305648d72f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:43 crc kubenswrapper[4725]: I0227 06:14:43.802944 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-657744c59d-zszcc"] Feb 27 06:14:43 crc kubenswrapper[4725]: W0227 06:14:43.810154 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95915eb_bfa1_4ee1_9aec_3009a4344f7b.slice/crio-65c307e2f9a7f04c7178900aefbdcfcd946e91a91e901a8a92998bfc472a352b WatchSource:0}: Error finding container 65c307e2f9a7f04c7178900aefbdcfcd946e91a91e901a8a92998bfc472a352b: Status 404 returned error can't find the container with id 65c307e2f9a7f04c7178900aefbdcfcd946e91a91e901a8a92998bfc472a352b Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.166733 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0afde6d5-bf16-4603-a231-6b305648d72f","Type":"ContainerDied","Data":"5a2607a277194bd41b062ec3286b45c09d37d5fba0e60fef919d84ad44193b15"} Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.166775 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a2607a277194bd41b062ec3286b45c09d37d5fba0e60fef919d84ad44193b15" Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.166832 4725 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.179835 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" event={"ID":"d95915eb-bfa1-4ee1-9aec-3009a4344f7b","Type":"ContainerStarted","Data":"63a8ff098c0cf49e2eb59db92e6b024fa7880c2cfb76b1c8437bd5fe8dba4809"} Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.179926 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" event={"ID":"d95915eb-bfa1-4ee1-9aec-3009a4344f7b","Type":"ContainerStarted","Data":"65c307e2f9a7f04c7178900aefbdcfcd946e91a91e901a8a92998bfc472a352b"} Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.179966 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.196239 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.216955 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" podStartSLOduration=5.216938872 podStartE2EDuration="5.216938872s" podCreationTimestamp="2026-02-27 06:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:44.216548281 +0000 UTC m=+262.679168860" watchObservedRunningTime="2026-02-27 06:14:44.216938872 +0000 UTC m=+262.679559441" Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.817629 4725 csr.go:261] certificate signing request csr-l68pg is approved, waiting to be issued Feb 27 06:14:44 crc kubenswrapper[4725]: I0227 06:14:44.822468 
4725 csr.go:257] certificate signing request csr-l68pg is issued Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.182121 4725 generic.go:334] "Generic (PLEG): container finished" podID="06c0abcb-fb62-4f62-b73e-a27620de9add" containerID="76c19525ba025394fb52d83c22e1eb7190d94dd6551569de289c5d63f566f2af" exitCode=0 Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.182433 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" event={"ID":"06c0abcb-fb62-4f62-b73e-a27620de9add","Type":"ContainerDied","Data":"76c19525ba025394fb52d83c22e1eb7190d94dd6551569de289c5d63f566f2af"} Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.800438 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 06:14:45 crc kubenswrapper[4725]: E0227 06:14:45.801144 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afde6d5-bf16-4603-a231-6b305648d72f" containerName="pruner" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.801166 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afde6d5-bf16-4603-a231-6b305648d72f" containerName="pruner" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.801368 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afde6d5-bf16-4603-a231-6b305648d72f" containerName="pruner" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.802021 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.804455 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.804479 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.810828 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.824929 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-20 21:18:23.304321062 +0000 UTC Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.824961 4725 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6399h3m37.479362356s for next certificate rotation Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.964084 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.964136 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16976e74-aa71-40ae-a441-adfc92420ac5-kube-api-access\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:45 crc kubenswrapper[4725]: I0227 06:14:45.964209 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-var-lock\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.065127 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.065178 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16976e74-aa71-40ae-a441-adfc92420ac5-kube-api-access\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.065228 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-var-lock\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.065323 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-var-lock\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.065370 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.085281 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16976e74-aa71-40ae-a441-adfc92420ac5-kube-api-access\") pod \"installer-9-crc\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.121193 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.187644 4725 generic.go:334] "Generic (PLEG): container finished" podID="234512e0-3471-4bd8-b783-6df7b63f2cfe" containerID="dec50bf79e349fe9abc1a4b764874fa1b281fe7a54a003aef33b4ec78a2900b6" exitCode=0 Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.188167 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536212-627nh" event={"ID":"234512e0-3471-4bd8-b783-6df7b63f2cfe","Type":"ContainerDied","Data":"dec50bf79e349fe9abc1a4b764874fa1b281fe7a54a003aef33b4ec78a2900b6"} Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.496081 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.549654 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 06:14:46 crc kubenswrapper[4725]: W0227 06:14:46.552543 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod16976e74_aa71_40ae_a441_adfc92420ac5.slice/crio-f1acff55b97fd11bf3d6836d0a1a9a7a07d7473d03b9cdfd70665f351b41108d WatchSource:0}: Error finding container f1acff55b97fd11bf3d6836d0a1a9a7a07d7473d03b9cdfd70665f351b41108d: Status 404 returned error can't find the container with id f1acff55b97fd11bf3d6836d0a1a9a7a07d7473d03b9cdfd70665f351b41108d Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.571829 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnb4\" (UniqueName: \"kubernetes.io/projected/06c0abcb-fb62-4f62-b73e-a27620de9add-kube-api-access-8nnb4\") pod \"06c0abcb-fb62-4f62-b73e-a27620de9add\" (UID: \"06c0abcb-fb62-4f62-b73e-a27620de9add\") " Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.578388 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c0abcb-fb62-4f62-b73e-a27620de9add-kube-api-access-8nnb4" (OuterVolumeSpecName: "kube-api-access-8nnb4") pod "06c0abcb-fb62-4f62-b73e-a27620de9add" (UID: "06c0abcb-fb62-4f62-b73e-a27620de9add"). InnerVolumeSpecName "kube-api-access-8nnb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.673997 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnb4\" (UniqueName: \"kubernetes.io/projected/06c0abcb-fb62-4f62-b73e-a27620de9add-kube-api-access-8nnb4\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.856665 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:46 crc kubenswrapper[4725]: I0227 06:14:46.857601 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.197425 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" event={"ID":"06c0abcb-fb62-4f62-b73e-a27620de9add","Type":"ContainerDied","Data":"754f60393f81a0610c0029d04eb82deb299079ff5f6102ec2e926fb2204830d7"} Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.198720 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="754f60393f81a0610c0029d04eb82deb299079ff5f6102ec2e926fb2204830d7" Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.197432 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536214-p7k5h" Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.198985 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"16976e74-aa71-40ae-a441-adfc92420ac5","Type":"ContainerStarted","Data":"f1acff55b97fd11bf3d6836d0a1a9a7a07d7473d03b9cdfd70665f351b41108d"} Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.592973 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536212-627nh" Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.688578 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2v98\" (UniqueName: \"kubernetes.io/projected/234512e0-3471-4bd8-b783-6df7b63f2cfe-kube-api-access-s2v98\") pod \"234512e0-3471-4bd8-b783-6df7b63f2cfe\" (UID: \"234512e0-3471-4bd8-b783-6df7b63f2cfe\") " Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.694050 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234512e0-3471-4bd8-b783-6df7b63f2cfe-kube-api-access-s2v98" (OuterVolumeSpecName: "kube-api-access-s2v98") pod "234512e0-3471-4bd8-b783-6df7b63f2cfe" (UID: "234512e0-3471-4bd8-b783-6df7b63f2cfe"). InnerVolumeSpecName "kube-api-access-s2v98". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:14:47 crc kubenswrapper[4725]: I0227 06:14:47.789876 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2v98\" (UniqueName: \"kubernetes.io/projected/234512e0-3471-4bd8-b783-6df7b63f2cfe-kube-api-access-s2v98\") on node \"crc\" DevicePath \"\"" Feb 27 06:14:48 crc kubenswrapper[4725]: I0227 06:14:48.207410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536212-627nh" event={"ID":"234512e0-3471-4bd8-b783-6df7b63f2cfe","Type":"ContainerDied","Data":"c968e56d6f45f123d0e6f236b3cbbc2384fe6d4f94c88aed5a46791a9d4bfcc8"} Feb 27 06:14:48 crc kubenswrapper[4725]: I0227 06:14:48.207452 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c968e56d6f45f123d0e6f236b3cbbc2384fe6d4f94c88aed5a46791a9d4bfcc8" Feb 27 06:14:48 crc kubenswrapper[4725]: I0227 06:14:48.207456 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536212-627nh" Feb 27 06:14:48 crc kubenswrapper[4725]: I0227 06:14:48.211564 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"16976e74-aa71-40ae-a441-adfc92420ac5","Type":"ContainerStarted","Data":"ab5fd7ef0320fdefbc5ef0ac447d9b8b6f47d750a8c6ad340364f16cad65a4c7"} Feb 27 06:14:48 crc kubenswrapper[4725]: I0227 06:14:48.377467 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-86fp7" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="registry-server" probeResult="failure" output=< Feb 27 06:14:48 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:14:48 crc kubenswrapper[4725]: > Feb 27 06:14:48 crc kubenswrapper[4725]: I0227 06:14:48.626788 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.6267619399999997 podStartE2EDuration="3.62676194s" podCreationTimestamp="2026-02-27 06:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:14:48.251883349 +0000 UTC m=+266.714503938" watchObservedRunningTime="2026-02-27 06:14:48.62676194 +0000 UTC m=+267.089382539" Feb 27 06:14:53 crc kubenswrapper[4725]: I0227 06:14:53.248225 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerStarted","Data":"59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b"} Feb 27 06:14:53 crc kubenswrapper[4725]: I0227 06:14:53.253196 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" 
event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerStarted","Data":"432e3bb2fff995c298e3e5f73e6564e1b09da1e719b77e2fe017b5193fffa936"} Feb 27 06:14:53 crc kubenswrapper[4725]: I0227 06:14:53.268766 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerStarted","Data":"55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.283189 4725 generic.go:334] "Generic (PLEG): container finished" podID="c008fcf9-f898-434a-b077-f8921e01be05" containerID="e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8" exitCode=0 Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.283297 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q5b8" event={"ID":"c008fcf9-f898-434a-b077-f8921e01be05","Type":"ContainerDied","Data":"e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.293993 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerStarted","Data":"52f0f9b342107a6eff68fcaf1b51f97912d79cf477365a713771eac5e8110c1e"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.298435 4725 generic.go:334] "Generic (PLEG): container finished" podID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerID="10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9" exitCode=0 Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.298525 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lblv9" event={"ID":"2079d9d5-1660-4e80-a909-40d68fbe3c87","Type":"ContainerDied","Data":"10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 
06:14:54.317176 4725 generic.go:334] "Generic (PLEG): container finished" podID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerID="432e3bb2fff995c298e3e5f73e6564e1b09da1e719b77e2fe017b5193fffa936" exitCode=0 Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.317324 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerDied","Data":"432e3bb2fff995c298e3e5f73e6564e1b09da1e719b77e2fe017b5193fffa936"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.322364 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerStarted","Data":"c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.336225 4725 generic.go:334] "Generic (PLEG): container finished" podID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerID="55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3" exitCode=0 Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.336413 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerDied","Data":"55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3"} Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.357716 4725 generic.go:334] "Generic (PLEG): container finished" podID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerID="59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b" exitCode=0 Feb 27 06:14:54 crc kubenswrapper[4725]: I0227 06:14:54.357758 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" 
event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerDied","Data":"59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.074959 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zbhxj"] Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.364649 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q5b8" event={"ID":"c008fcf9-f898-434a-b077-f8921e01be05","Type":"ContainerStarted","Data":"73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.365823 4725 generic.go:334] "Generic (PLEG): container finished" podID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerID="52f0f9b342107a6eff68fcaf1b51f97912d79cf477365a713771eac5e8110c1e" exitCode=0 Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.365866 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerDied","Data":"52f0f9b342107a6eff68fcaf1b51f97912d79cf477365a713771eac5e8110c1e"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.368546 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lblv9" event={"ID":"2079d9d5-1660-4e80-a909-40d68fbe3c87","Type":"ContainerStarted","Data":"85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.370999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerStarted","Data":"86dc3f795e4dd01fe17afb556a8f9591d30053b756fc0a6768d252625c8ddb3c"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.373201 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerStarted","Data":"997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.375109 4725 generic.go:334] "Generic (PLEG): container finished" podID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerID="c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e" exitCode=0 Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.375167 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerDied","Data":"c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.378082 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerStarted","Data":"6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c"} Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.404145 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2q5b8" podStartSLOduration=2.275730588 podStartE2EDuration="50.404125959s" podCreationTimestamp="2026-02-27 06:14:05 +0000 UTC" firstStartedPulling="2026-02-27 06:14:06.551475629 +0000 UTC m=+225.014096198" lastFinishedPulling="2026-02-27 06:14:54.67987099 +0000 UTC m=+273.142491569" observedRunningTime="2026-02-27 06:14:55.404008616 +0000 UTC m=+273.866629185" watchObservedRunningTime="2026-02-27 06:14:55.404125959 +0000 UTC m=+273.866746528" Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.427386 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lblv9" podStartSLOduration=3.30381247 podStartE2EDuration="50.427348938s" 
podCreationTimestamp="2026-02-27 06:14:05 +0000 UTC" firstStartedPulling="2026-02-27 06:14:07.619210684 +0000 UTC m=+226.081831253" lastFinishedPulling="2026-02-27 06:14:54.742747122 +0000 UTC m=+273.205367721" observedRunningTime="2026-02-27 06:14:55.426321919 +0000 UTC m=+273.888942488" watchObservedRunningTime="2026-02-27 06:14:55.427348938 +0000 UTC m=+273.889969507" Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.444763 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7q5w4" podStartSLOduration=3.441501114 podStartE2EDuration="49.44474702s" podCreationTimestamp="2026-02-27 06:14:06 +0000 UTC" firstStartedPulling="2026-02-27 06:14:08.738377093 +0000 UTC m=+227.200997662" lastFinishedPulling="2026-02-27 06:14:54.741622969 +0000 UTC m=+273.204243568" observedRunningTime="2026-02-27 06:14:55.443819843 +0000 UTC m=+273.906440412" watchObservedRunningTime="2026-02-27 06:14:55.44474702 +0000 UTC m=+273.907367589" Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.463180 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6cpp" podStartSLOduration=2.888592484 podStartE2EDuration="52.46316026s" podCreationTimestamp="2026-02-27 06:14:03 +0000 UTC" firstStartedPulling="2026-02-27 06:14:05.29189164 +0000 UTC m=+223.754512209" lastFinishedPulling="2026-02-27 06:14:54.866459396 +0000 UTC m=+273.329079985" observedRunningTime="2026-02-27 06:14:55.461605866 +0000 UTC m=+273.924226435" watchObservedRunningTime="2026-02-27 06:14:55.46316026 +0000 UTC m=+273.925780829" Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.497554 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j6hj6" podStartSLOduration=3.021387291 podStartE2EDuration="52.497535771s" podCreationTimestamp="2026-02-27 06:14:03 +0000 UTC" firstStartedPulling="2026-02-27 06:14:05.417750367 +0000 
UTC m=+223.880370926" lastFinishedPulling="2026-02-27 06:14:54.893898827 +0000 UTC m=+273.356519406" observedRunningTime="2026-02-27 06:14:55.494429361 +0000 UTC m=+273.957049940" watchObservedRunningTime="2026-02-27 06:14:55.497535771 +0000 UTC m=+273.960156340" Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.684734 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:55 crc kubenswrapper[4725]: I0227 06:14:55.684909 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.026772 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.026819 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.393481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerStarted","Data":"6bb935faeceb0c351a755df3aa287ed2f9a5e3e20c10cc581251e572bd996459"} Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.397276 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerStarted","Data":"13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f"} Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.412493 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8rj5q" podStartSLOduration=2.979280819 podStartE2EDuration="53.412474054s" podCreationTimestamp="2026-02-27 06:14:03 +0000 UTC" 
firstStartedPulling="2026-02-27 06:14:05.344753115 +0000 UTC m=+223.807373684" lastFinishedPulling="2026-02-27 06:14:55.77794634 +0000 UTC m=+274.240566919" observedRunningTime="2026-02-27 06:14:56.410624791 +0000 UTC m=+274.873245370" watchObservedRunningTime="2026-02-27 06:14:56.412474054 +0000 UTC m=+274.875094623" Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.432710 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-htnhk" podStartSLOduration=2.89296656 podStartE2EDuration="53.432667756s" podCreationTimestamp="2026-02-27 06:14:03 +0000 UTC" firstStartedPulling="2026-02-27 06:14:05.283530159 +0000 UTC m=+223.746150728" lastFinishedPulling="2026-02-27 06:14:55.823231355 +0000 UTC m=+274.285851924" observedRunningTime="2026-02-27 06:14:56.429411392 +0000 UTC m=+274.892031951" watchObservedRunningTime="2026-02-27 06:14:56.432667756 +0000 UTC m=+274.895288325" Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.729220 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2q5b8" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="registry-server" probeResult="failure" output=< Feb 27 06:14:56 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:14:56 crc kubenswrapper[4725]: > Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.914424 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:56 crc kubenswrapper[4725]: I0227 06:14:56.965993 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:14:57 crc kubenswrapper[4725]: I0227 06:14:57.108083 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-lblv9" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="registry-server" 
probeResult="failure" output=< Feb 27 06:14:57 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:14:57 crc kubenswrapper[4725]: > Feb 27 06:14:57 crc kubenswrapper[4725]: I0227 06:14:57.243351 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:57 crc kubenswrapper[4725]: I0227 06:14:57.243407 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:14:58 crc kubenswrapper[4725]: I0227 06:14:58.279923 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7q5w4" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="registry-server" probeResult="failure" output=< Feb 27 06:14:58 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:14:58 crc kubenswrapper[4725]: > Feb 27 06:14:59 crc kubenswrapper[4725]: I0227 06:14:59.308086 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-657744c59d-zszcc"] Feb 27 06:14:59 crc kubenswrapper[4725]: I0227 06:14:59.308353 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" podUID="d95915eb-bfa1-4ee1-9aec-3009a4344f7b" containerName="controller-manager" containerID="cri-o://63a8ff098c0cf49e2eb59db92e6b024fa7880c2cfb76b1c8437bd5fe8dba4809" gracePeriod=30 Feb 27 06:14:59 crc kubenswrapper[4725]: I0227 06:14:59.320657 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j"] Feb 27 06:14:59 crc kubenswrapper[4725]: I0227 06:14:59.320995 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" 
podUID="4afb07af-069f-4095-a8ca-10dee0be3a48" containerName="route-controller-manager" containerID="cri-o://60dfa8131fa3acd400ef584cc8ef241f0f4b28c64e0a996d8c8d4842210a03dc" gracePeriod=30 Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.139583 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp"] Feb 27 06:15:00 crc kubenswrapper[4725]: E0227 06:15:00.140148 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c0abcb-fb62-4f62-b73e-a27620de9add" containerName="oc" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.140162 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c0abcb-fb62-4f62-b73e-a27620de9add" containerName="oc" Feb 27 06:15:00 crc kubenswrapper[4725]: E0227 06:15:00.140180 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234512e0-3471-4bd8-b783-6df7b63f2cfe" containerName="oc" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.140187 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="234512e0-3471-4bd8-b783-6df7b63f2cfe" containerName="oc" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.140325 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="234512e0-3471-4bd8-b783-6df7b63f2cfe" containerName="oc" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.140343 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c0abcb-fb62-4f62-b73e-a27620de9add" containerName="oc" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.140828 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.143439 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.148707 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp"] Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.162763 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.287503 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c966250-ab30-4713-ba4f-c19bd653309d-config-volume\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.287562 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm9lb\" (UniqueName: \"kubernetes.io/projected/5c966250-ab30-4713-ba4f-c19bd653309d-kube-api-access-bm9lb\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.287638 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c966250-ab30-4713-ba4f-c19bd653309d-secret-volume\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.389215 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c966250-ab30-4713-ba4f-c19bd653309d-config-volume\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.389261 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm9lb\" (UniqueName: \"kubernetes.io/projected/5c966250-ab30-4713-ba4f-c19bd653309d-kube-api-access-bm9lb\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.389330 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c966250-ab30-4713-ba4f-c19bd653309d-secret-volume\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.391252 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c966250-ab30-4713-ba4f-c19bd653309d-config-volume\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.409226 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5c966250-ab30-4713-ba4f-c19bd653309d-secret-volume\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.409356 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm9lb\" (UniqueName: \"kubernetes.io/projected/5c966250-ab30-4713-ba4f-c19bd653309d-kube-api-access-bm9lb\") pod \"collect-profiles-29536215-v2ktp\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.444343 4725 generic.go:334] "Generic (PLEG): container finished" podID="d95915eb-bfa1-4ee1-9aec-3009a4344f7b" containerID="63a8ff098c0cf49e2eb59db92e6b024fa7880c2cfb76b1c8437bd5fe8dba4809" exitCode=0 Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.444442 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" event={"ID":"d95915eb-bfa1-4ee1-9aec-3009a4344f7b","Type":"ContainerDied","Data":"63a8ff098c0cf49e2eb59db92e6b024fa7880c2cfb76b1c8437bd5fe8dba4809"} Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.449709 4725 generic.go:334] "Generic (PLEG): container finished" podID="4afb07af-069f-4095-a8ca-10dee0be3a48" containerID="60dfa8131fa3acd400ef584cc8ef241f0f4b28c64e0a996d8c8d4842210a03dc" exitCode=0 Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.450026 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" event={"ID":"4afb07af-069f-4095-a8ca-10dee0be3a48","Type":"ContainerDied","Data":"60dfa8131fa3acd400ef584cc8ef241f0f4b28c64e0a996d8c8d4842210a03dc"} Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.472193 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.622637 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.630021 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.650341 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf"] Feb 27 06:15:00 crc kubenswrapper[4725]: E0227 06:15:00.650560 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95915eb-bfa1-4ee1-9aec-3009a4344f7b" containerName="controller-manager" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.650572 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95915eb-bfa1-4ee1-9aec-3009a4344f7b" containerName="controller-manager" Feb 27 06:15:00 crc kubenswrapper[4725]: E0227 06:15:00.650590 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afb07af-069f-4095-a8ca-10dee0be3a48" containerName="route-controller-manager" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.650596 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afb07af-069f-4095-a8ca-10dee0be3a48" containerName="route-controller-manager" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.650695 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afb07af-069f-4095-a8ca-10dee0be3a48" containerName="route-controller-manager" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.650706 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95915eb-bfa1-4ee1-9aec-3009a4344f7b" containerName="controller-manager" Feb 27 06:15:00 crc 
kubenswrapper[4725]: I0227 06:15:00.651037 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.664735 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf"] Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692596 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-serving-cert\") pod \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692653 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-config\") pod \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692681 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-proxy-ca-bundles\") pod \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692729 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vssh\" (UniqueName: \"kubernetes.io/projected/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-kube-api-access-4vssh\") pod \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692776 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-client-ca\") pod \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\" (UID: \"d95915eb-bfa1-4ee1-9aec-3009a4344f7b\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692796 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4afb07af-069f-4095-a8ca-10dee0be3a48-serving-cert\") pod \"4afb07af-069f-4095-a8ca-10dee0be3a48\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692825 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxkjn\" (UniqueName: \"kubernetes.io/projected/4afb07af-069f-4095-a8ca-10dee0be3a48-kube-api-access-lxkjn\") pod \"4afb07af-069f-4095-a8ca-10dee0be3a48\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692845 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-config\") pod \"4afb07af-069f-4095-a8ca-10dee0be3a48\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.692889 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-client-ca\") pod \"4afb07af-069f-4095-a8ca-10dee0be3a48\" (UID: \"4afb07af-069f-4095-a8ca-10dee0be3a48\") " Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.693701 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-client-ca" (OuterVolumeSpecName: "client-ca") pod "4afb07af-069f-4095-a8ca-10dee0be3a48" (UID: "4afb07af-069f-4095-a8ca-10dee0be3a48"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.694205 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-config" (OuterVolumeSpecName: "config") pod "d95915eb-bfa1-4ee1-9aec-3009a4344f7b" (UID: "d95915eb-bfa1-4ee1-9aec-3009a4344f7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.694671 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d95915eb-bfa1-4ee1-9aec-3009a4344f7b" (UID: "d95915eb-bfa1-4ee1-9aec-3009a4344f7b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.696877 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.696922 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.696933 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.697831 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "d95915eb-bfa1-4ee1-9aec-3009a4344f7b" 
(UID: "d95915eb-bfa1-4ee1-9aec-3009a4344f7b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.698063 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-config" (OuterVolumeSpecName: "config") pod "4afb07af-069f-4095-a8ca-10dee0be3a48" (UID: "4afb07af-069f-4095-a8ca-10dee0be3a48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.698097 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d95915eb-bfa1-4ee1-9aec-3009a4344f7b" (UID: "d95915eb-bfa1-4ee1-9aec-3009a4344f7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.700818 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-kube-api-access-4vssh" (OuterVolumeSpecName: "kube-api-access-4vssh") pod "d95915eb-bfa1-4ee1-9aec-3009a4344f7b" (UID: "d95915eb-bfa1-4ee1-9aec-3009a4344f7b"). InnerVolumeSpecName "kube-api-access-4vssh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.701086 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afb07af-069f-4095-a8ca-10dee0be3a48-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4afb07af-069f-4095-a8ca-10dee0be3a48" (UID: "4afb07af-069f-4095-a8ca-10dee0be3a48"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.701611 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afb07af-069f-4095-a8ca-10dee0be3a48-kube-api-access-lxkjn" (OuterVolumeSpecName: "kube-api-access-lxkjn") pod "4afb07af-069f-4095-a8ca-10dee0be3a48" (UID: "4afb07af-069f-4095-a8ca-10dee0be3a48"). InnerVolumeSpecName "kube-api-access-lxkjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798350 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-client-ca\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798442 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gnx\" (UniqueName: \"kubernetes.io/projected/2f572895-29f6-403d-9e14-c508c33140df-kube-api-access-55gnx\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798656 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f572895-29f6-403d-9e14-c508c33140df-serving-cert\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798751 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-config\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798846 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798862 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4afb07af-069f-4095-a8ca-10dee0be3a48-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798872 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxkjn\" (UniqueName: \"kubernetes.io/projected/4afb07af-069f-4095-a8ca-10dee0be3a48-kube-api-access-lxkjn\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798881 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4afb07af-069f-4095-a8ca-10dee0be3a48-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798890 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.798900 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vssh\" (UniqueName: \"kubernetes.io/projected/d95915eb-bfa1-4ee1-9aec-3009a4344f7b-kube-api-access-4vssh\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 
06:15:00.900457 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f572895-29f6-403d-9e14-c508c33140df-serving-cert\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.900571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-config\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.900689 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-client-ca\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.900740 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55gnx\" (UniqueName: \"kubernetes.io/projected/2f572895-29f6-403d-9e14-c508c33140df-kube-api-access-55gnx\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.901806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-client-ca\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: 
\"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.902873 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-config\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.906252 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f572895-29f6-403d-9e14-c508c33140df-serving-cert\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.920502 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp"] Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.921453 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gnx\" (UniqueName: \"kubernetes.io/projected/2f572895-29f6-403d-9e14-c508c33140df-kube-api-access-55gnx\") pod \"route-controller-manager-55bfc87c88-qmgkf\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:00 crc kubenswrapper[4725]: W0227 06:15:00.926438 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c966250_ab30_4713_ba4f_c19bd653309d.slice/crio-713b34286a135a00dffb146e158de05237ea8a0ba9907d5a179d325ae37f2aad WatchSource:0}: Error finding container 
713b34286a135a00dffb146e158de05237ea8a0ba9907d5a179d325ae37f2aad: Status 404 returned error can't find the container with id 713b34286a135a00dffb146e158de05237ea8a0ba9907d5a179d325ae37f2aad Feb 27 06:15:00 crc kubenswrapper[4725]: I0227 06:15:00.978723 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.203065 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf"] Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.459973 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" event={"ID":"2f572895-29f6-403d-9e14-c508c33140df","Type":"ContainerStarted","Data":"2b48832c92343d99b27c48803bd03f82e0b945416f57b98308e111d557454a07"} Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.460448 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.460472 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" event={"ID":"2f572895-29f6-403d-9e14-c508c33140df","Type":"ContainerStarted","Data":"c158d6ae94ef5a0f73e022f4e5fb44fded3a0016b8eec79ab662bba7cbb363e8"} Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.464574 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" event={"ID":"d95915eb-bfa1-4ee1-9aec-3009a4344f7b","Type":"ContainerDied","Data":"65c307e2f9a7f04c7178900aefbdcfcd946e91a91e901a8a92998bfc472a352b"} Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.464654 4725 scope.go:117] "RemoveContainer" 
containerID="63a8ff098c0cf49e2eb59db92e6b024fa7880c2cfb76b1c8437bd5fe8dba4809" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.464629 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-657744c59d-zszcc" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.473379 4725 generic.go:334] "Generic (PLEG): container finished" podID="5c966250-ab30-4713-ba4f-c19bd653309d" containerID="e9533316ded9adecbc4eaf3ca53de475bfd984f1ec3585172bd6c81b05b73042" exitCode=0 Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.473438 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" event={"ID":"5c966250-ab30-4713-ba4f-c19bd653309d","Type":"ContainerDied","Data":"e9533316ded9adecbc4eaf3ca53de475bfd984f1ec3585172bd6c81b05b73042"} Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.473474 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" event={"ID":"5c966250-ab30-4713-ba4f-c19bd653309d","Type":"ContainerStarted","Data":"713b34286a135a00dffb146e158de05237ea8a0ba9907d5a179d325ae37f2aad"} Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.474926 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" event={"ID":"4afb07af-069f-4095-a8ca-10dee0be3a48","Type":"ContainerDied","Data":"812a43b444104c638b66de1a734d606e0b59186d68b7016ce131a1f905631cc1"} Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.474990 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.490791 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" podStartSLOduration=2.490774355 podStartE2EDuration="2.490774355s" podCreationTimestamp="2026-02-27 06:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:15:01.488124519 +0000 UTC m=+279.950745108" watchObservedRunningTime="2026-02-27 06:15:01.490774355 +0000 UTC m=+279.953394924" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.494672 4725 scope.go:117] "RemoveContainer" containerID="60dfa8131fa3acd400ef584cc8ef241f0f4b28c64e0a996d8c8d4842210a03dc" Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.531591 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-657744c59d-zszcc"] Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.538689 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-657744c59d-zszcc"] Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.547447 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j"] Feb 27 06:15:01 crc kubenswrapper[4725]: I0227 06:15:01.551350 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79668c5b4-vf95j"] Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.214466 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.265865 4725 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4afb07af-069f-4095-a8ca-10dee0be3a48" path="/var/lib/kubelet/pods/4afb07af-069f-4095-a8ca-10dee0be3a48/volumes" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.267357 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95915eb-bfa1-4ee1-9aec-3009a4344f7b" path="/var/lib/kubelet/pods/d95915eb-bfa1-4ee1-9aec-3009a4344f7b/volumes" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.554723 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.554793 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.554859 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.555607 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.555711 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72" gracePeriod=600 Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.747169 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.829117 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c966250-ab30-4713-ba4f-c19bd653309d-secret-volume\") pod \"5c966250-ab30-4713-ba4f-c19bd653309d\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.829196 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm9lb\" (UniqueName: \"kubernetes.io/projected/5c966250-ab30-4713-ba4f-c19bd653309d-kube-api-access-bm9lb\") pod \"5c966250-ab30-4713-ba4f-c19bd653309d\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.829241 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c966250-ab30-4713-ba4f-c19bd653309d-config-volume\") pod \"5c966250-ab30-4713-ba4f-c19bd653309d\" (UID: \"5c966250-ab30-4713-ba4f-c19bd653309d\") " Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.830697 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c966250-ab30-4713-ba4f-c19bd653309d-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c966250-ab30-4713-ba4f-c19bd653309d" (UID: "5c966250-ab30-4713-ba4f-c19bd653309d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.835697 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c966250-ab30-4713-ba4f-c19bd653309d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c966250-ab30-4713-ba4f-c19bd653309d" (UID: "5c966250-ab30-4713-ba4f-c19bd653309d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.836450 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c966250-ab30-4713-ba4f-c19bd653309d-kube-api-access-bm9lb" (OuterVolumeSpecName: "kube-api-access-bm9lb") pod "5c966250-ab30-4713-ba4f-c19bd653309d" (UID: "5c966250-ab30-4713-ba4f-c19bd653309d"). InnerVolumeSpecName "kube-api-access-bm9lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.930790 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c966250-ab30-4713-ba4f-c19bd653309d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.930829 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c966250-ab30-4713-ba4f-c19bd653309d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:02 crc kubenswrapper[4725]: I0227 06:15:02.930842 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm9lb\" (UniqueName: \"kubernetes.io/projected/5c966250-ab30-4713-ba4f-c19bd653309d-kube-api-access-bm9lb\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.271021 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-744f9f95b5-sqpgm"] Feb 27 06:15:03 crc kubenswrapper[4725]: E0227 
06:15:03.271849 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c966250-ab30-4713-ba4f-c19bd653309d" containerName="collect-profiles" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.271872 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c966250-ab30-4713-ba4f-c19bd653309d" containerName="collect-profiles" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.272011 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c966250-ab30-4713-ba4f-c19bd653309d" containerName="collect-profiles" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.272537 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.274563 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.274981 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.275260 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.277010 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.282654 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.282894 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.286324 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-744f9f95b5-sqpgm"] Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.290549 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.336649 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-config\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.336775 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztp7w\" (UniqueName: \"kubernetes.io/projected/2bff9ef6-621e-46ce-b7b1-c62a9153c370-kube-api-access-ztp7w\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.337063 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-client-ca\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.337127 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bff9ef6-621e-46ce-b7b1-c62a9153c370-serving-cert\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " 
pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.338481 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-proxy-ca-bundles\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.440052 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztp7w\" (UniqueName: \"kubernetes.io/projected/2bff9ef6-621e-46ce-b7b1-c62a9153c370-kube-api-access-ztp7w\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.440138 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-client-ca\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.440177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bff9ef6-621e-46ce-b7b1-c62a9153c370-serving-cert\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.440239 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-proxy-ca-bundles\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.440350 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-config\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.442248 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-client-ca\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.442618 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-config\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.442641 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-proxy-ca-bundles\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.450111 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bff9ef6-621e-46ce-b7b1-c62a9153c370-serving-cert\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.471137 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztp7w\" (UniqueName: \"kubernetes.io/projected/2bff9ef6-621e-46ce-b7b1-c62a9153c370-kube-api-access-ztp7w\") pod \"controller-manager-744f9f95b5-sqpgm\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.518808 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" event={"ID":"5c966250-ab30-4713-ba4f-c19bd653309d","Type":"ContainerDied","Data":"713b34286a135a00dffb146e158de05237ea8a0ba9907d5a179d325ae37f2aad"} Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.518904 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="713b34286a135a00dffb146e158de05237ea8a0ba9907d5a179d325ae37f2aad" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.518931 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.523752 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72" exitCode=0 Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.524345 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72"} Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.524420 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"f09b0a4f42ec4f4cb86c337e85961d890e6fd84143196dec401eb3a8f601acdd"} Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.594686 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.700967 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6cpp" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.701049 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6cpp" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.813053 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6cpp" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.852242 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-htnhk" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.852310 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-htnhk" Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.889915 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-744f9f95b5-sqpgm"] Feb 27 06:15:03 crc kubenswrapper[4725]: I0227 06:15:03.907905 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-htnhk" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.068020 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.068625 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.125813 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j6hj6" Feb 
27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.269727 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.269768 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.370609 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.531217 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" event={"ID":"2bff9ef6-621e-46ce-b7b1-c62a9153c370","Type":"ContainerStarted","Data":"50a16f1288cb11af34d5d1b9f52aa1f0845a4017c54393c4b16510eb90288d80"} Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.531280 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" event={"ID":"2bff9ef6-621e-46ce-b7b1-c62a9153c370","Type":"ContainerStarted","Data":"3497cb1cf6a17bece941a2c5cef3e2850130099f661fd1ea7e6876cd1f848d57"} Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.559535 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" podStartSLOduration=5.559513089 podStartE2EDuration="5.559513089s" podCreationTimestamp="2026-02-27 06:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:15:04.558903122 +0000 UTC m=+283.021523691" watchObservedRunningTime="2026-02-27 06:15:04.559513089 +0000 UTC m=+283.022133658" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.577401 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.585579 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-htnhk" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.591729 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:15:04 crc kubenswrapper[4725]: I0227 06:15:04.599051 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6cpp" Feb 27 06:15:05 crc kubenswrapper[4725]: I0227 06:15:05.542162 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:05 crc kubenswrapper[4725]: I0227 06:15:05.542868 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rj5q"] Feb 27 06:15:05 crc kubenswrapper[4725]: I0227 06:15:05.548874 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:05 crc kubenswrapper[4725]: I0227 06:15:05.738074 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:15:05 crc kubenswrapper[4725]: I0227 06:15:05.805512 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:15:06 crc kubenswrapper[4725]: I0227 06:15:06.099167 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:15:06 crc kubenswrapper[4725]: I0227 06:15:06.171987 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:15:06 crc 
kubenswrapper[4725]: I0227 06:15:06.541314 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j6hj6"] Feb 27 06:15:06 crc kubenswrapper[4725]: I0227 06:15:06.550659 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8rj5q" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="registry-server" containerID="cri-o://6bb935faeceb0c351a755df3aa287ed2f9a5e3e20c10cc581251e572bd996459" gracePeriod=2 Feb 27 06:15:07 crc kubenswrapper[4725]: I0227 06:15:07.322171 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:15:07 crc kubenswrapper[4725]: I0227 06:15:07.380036 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:15:07 crc kubenswrapper[4725]: I0227 06:15:07.560227 4725 generic.go:334] "Generic (PLEG): container finished" podID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerID="6bb935faeceb0c351a755df3aa287ed2f9a5e3e20c10cc581251e572bd996459" exitCode=0 Feb 27 06:15:07 crc kubenswrapper[4725]: I0227 06:15:07.560378 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerDied","Data":"6bb935faeceb0c351a755df3aa287ed2f9a5e3e20c10cc581251e572bd996459"} Feb 27 06:15:07 crc kubenswrapper[4725]: I0227 06:15:07.560949 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j6hj6" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="registry-server" containerID="cri-o://6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c" gracePeriod=2 Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.115727 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.218222 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j55ss\" (UniqueName: \"kubernetes.io/projected/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-kube-api-access-j55ss\") pod \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.218342 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-catalog-content\") pod \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.218368 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-utilities\") pod \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\" (UID: \"3965afd0-6cf4-4ea2-86a1-ce69bb98f260\") " Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.219210 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-utilities" (OuterVolumeSpecName: "utilities") pod "3965afd0-6cf4-4ea2-86a1-ce69bb98f260" (UID: "3965afd0-6cf4-4ea2-86a1-ce69bb98f260"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.223889 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-kube-api-access-j55ss" (OuterVolumeSpecName: "kube-api-access-j55ss") pod "3965afd0-6cf4-4ea2-86a1-ce69bb98f260" (UID: "3965afd0-6cf4-4ea2-86a1-ce69bb98f260"). InnerVolumeSpecName "kube-api-access-j55ss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.226232 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.280385 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3965afd0-6cf4-4ea2-86a1-ce69bb98f260" (UID: "3965afd0-6cf4-4ea2-86a1-ce69bb98f260"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.319900 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2bg4\" (UniqueName: \"kubernetes.io/projected/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-kube-api-access-w2bg4\") pod \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.320049 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-catalog-content\") pod \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.320208 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-utilities\") pod \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\" (UID: \"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5\") " Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.320541 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j55ss\" (UniqueName: 
\"kubernetes.io/projected/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-kube-api-access-j55ss\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.320568 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.320583 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3965afd0-6cf4-4ea2-86a1-ce69bb98f260-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.321858 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-utilities" (OuterVolumeSpecName: "utilities") pod "3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" (UID: "3bfc8e5f-5a0f-4384-a2af-0817928d8ba5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.323433 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-kube-api-access-w2bg4" (OuterVolumeSpecName: "kube-api-access-w2bg4") pod "3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" (UID: "3bfc8e5f-5a0f-4384-a2af-0817928d8ba5"). InnerVolumeSpecName "kube-api-access-w2bg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.382907 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" (UID: "3bfc8e5f-5a0f-4384-a2af-0817928d8ba5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.421413 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.421462 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2bg4\" (UniqueName: \"kubernetes.io/projected/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-kube-api-access-w2bg4\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.421476 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.607591 4725 generic.go:334] "Generic (PLEG): container finished" podID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerID="6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c" exitCode=0 Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.607728 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerDied","Data":"6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c"} Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.607766 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j6hj6" event={"ID":"3bfc8e5f-5a0f-4384-a2af-0817928d8ba5","Type":"ContainerDied","Data":"84da428a1b6f81b9ee13ab3ebe621cd9816f9a2cd1daad7a65f0361268069b32"} Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.607789 4725 scope.go:117] "RemoveContainer" containerID="6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 
06:15:08.607818 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j6hj6" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.611187 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rj5q" event={"ID":"3965afd0-6cf4-4ea2-86a1-ce69bb98f260","Type":"ContainerDied","Data":"6aa1725f90bf8dd86a7801592b7f59e01f359a9d614dbe2388373c587bba8d3a"} Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.611268 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rj5q" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.639765 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rj5q"] Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.644105 4725 scope.go:117] "RemoveContainer" containerID="59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.645334 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8rj5q"] Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.656966 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j6hj6"] Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.661667 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j6hj6"] Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.695457 4725 scope.go:117] "RemoveContainer" containerID="8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.717267 4725 scope.go:117] "RemoveContainer" containerID="6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c" Feb 27 06:15:08 crc kubenswrapper[4725]: E0227 06:15:08.718059 4725 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c\": container with ID starting with 6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c not found: ID does not exist" containerID="6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.718118 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c"} err="failed to get container status \"6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c\": rpc error: code = NotFound desc = could not find container \"6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c\": container with ID starting with 6434d26f3c8e51fd2be82d2515f7f9134aa0c2b4884e133b75fb1ca4f994565c not found: ID does not exist" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.718165 4725 scope.go:117] "RemoveContainer" containerID="59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b" Feb 27 06:15:08 crc kubenswrapper[4725]: E0227 06:15:08.718559 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b\": container with ID starting with 59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b not found: ID does not exist" containerID="59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.718602 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b"} err="failed to get container status \"59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b\": rpc error: code = NotFound desc = could not find 
container \"59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b\": container with ID starting with 59612f2ab24b03a82d004a04ede14cad48e17cf5c2be5dec79944260c79bac1b not found: ID does not exist" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.718630 4725 scope.go:117] "RemoveContainer" containerID="8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339" Feb 27 06:15:08 crc kubenswrapper[4725]: E0227 06:15:08.718913 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339\": container with ID starting with 8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339 not found: ID does not exist" containerID="8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.718956 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339"} err="failed to get container status \"8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339\": rpc error: code = NotFound desc = could not find container \"8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339\": container with ID starting with 8a4f0ba365aa66a62f75f926a3dcf3fe0746934e3ba03a88977da68f331ff339 not found: ID does not exist" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.718983 4725 scope.go:117] "RemoveContainer" containerID="6bb935faeceb0c351a755df3aa287ed2f9a5e3e20c10cc581251e572bd996459" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.733168 4725 scope.go:117] "RemoveContainer" containerID="52f0f9b342107a6eff68fcaf1b51f97912d79cf477365a713771eac5e8110c1e" Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.752610 4725 scope.go:117] "RemoveContainer" containerID="0768a8fa11129655d4b1d82f0bd5334f11ec5c75ca7feee88c082eaed04be0d3" Feb 27 06:15:08 
crc kubenswrapper[4725]: I0227 06:15:08.938173 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lblv9"] Feb 27 06:15:08 crc kubenswrapper[4725]: I0227 06:15:08.938993 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lblv9" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="registry-server" containerID="cri-o://85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b" gracePeriod=2 Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.418599 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.538445 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-catalog-content\") pod \"2079d9d5-1660-4e80-a909-40d68fbe3c87\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.538543 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-utilities\") pod \"2079d9d5-1660-4e80-a909-40d68fbe3c87\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.538582 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l58ng\" (UniqueName: \"kubernetes.io/projected/2079d9d5-1660-4e80-a909-40d68fbe3c87-kube-api-access-l58ng\") pod \"2079d9d5-1660-4e80-a909-40d68fbe3c87\" (UID: \"2079d9d5-1660-4e80-a909-40d68fbe3c87\") " Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.540793 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-utilities" (OuterVolumeSpecName: "utilities") pod "2079d9d5-1660-4e80-a909-40d68fbe3c87" (UID: "2079d9d5-1660-4e80-a909-40d68fbe3c87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.552458 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2079d9d5-1660-4e80-a909-40d68fbe3c87-kube-api-access-l58ng" (OuterVolumeSpecName: "kube-api-access-l58ng") pod "2079d9d5-1660-4e80-a909-40d68fbe3c87" (UID: "2079d9d5-1660-4e80-a909-40d68fbe3c87"). InnerVolumeSpecName "kube-api-access-l58ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.568993 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2079d9d5-1660-4e80-a909-40d68fbe3c87" (UID: "2079d9d5-1660-4e80-a909-40d68fbe3c87"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.623220 4725 generic.go:334] "Generic (PLEG): container finished" podID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerID="85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b" exitCode=0 Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.623313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lblv9" event={"ID":"2079d9d5-1660-4e80-a909-40d68fbe3c87","Type":"ContainerDied","Data":"85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b"} Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.623343 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lblv9" event={"ID":"2079d9d5-1660-4e80-a909-40d68fbe3c87","Type":"ContainerDied","Data":"8bebb9102acb5332391ae798dc3cd2e049ad5636f5d37641043fe8cbc21b856d"} Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.623367 4725 scope.go:117] "RemoveContainer" containerID="85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.623487 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lblv9" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.640619 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.640660 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2079d9d5-1660-4e80-a909-40d68fbe3c87-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.640676 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l58ng\" (UniqueName: \"kubernetes.io/projected/2079d9d5-1660-4e80-a909-40d68fbe3c87-kube-api-access-l58ng\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.645992 4725 scope.go:117] "RemoveContainer" containerID="10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.663866 4725 scope.go:117] "RemoveContainer" containerID="d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.667847 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lblv9"] Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.677238 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lblv9"] Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.692256 4725 scope.go:117] "RemoveContainer" containerID="85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b" Feb 27 06:15:09 crc kubenswrapper[4725]: E0227 06:15:09.692997 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b\": container with ID starting with 85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b not found: ID does not exist" containerID="85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.693039 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b"} err="failed to get container status \"85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b\": rpc error: code = NotFound desc = could not find container \"85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b\": container with ID starting with 85330bc234950beb6d4bfb03625c35630c89c2561aa1d7176a6dc689c6c0974b not found: ID does not exist" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.693073 4725 scope.go:117] "RemoveContainer" containerID="10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9" Feb 27 06:15:09 crc kubenswrapper[4725]: E0227 06:15:09.693590 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9\": container with ID starting with 10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9 not found: ID does not exist" containerID="10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.693629 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9"} err="failed to get container status \"10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9\": rpc error: code = NotFound desc = could not find container \"10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9\": container with ID 
starting with 10e8a8b03ddcb073fdbf727163b86cbc4847a7c9356c14475276ead0618139f9 not found: ID does not exist" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.693660 4725 scope.go:117] "RemoveContainer" containerID="d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e" Feb 27 06:15:09 crc kubenswrapper[4725]: E0227 06:15:09.694131 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e\": container with ID starting with d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e not found: ID does not exist" containerID="d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e" Feb 27 06:15:09 crc kubenswrapper[4725]: I0227 06:15:09.694154 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e"} err="failed to get container status \"d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e\": rpc error: code = NotFound desc = could not find container \"d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e\": container with ID starting with d2ea31b0c0ac88abc15a8ca714f629fa608d1087e732754f2a35bcbf5af1f50e not found: ID does not exist" Feb 27 06:15:10 crc kubenswrapper[4725]: I0227 06:15:10.259874 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" path="/var/lib/kubelet/pods/2079d9d5-1660-4e80-a909-40d68fbe3c87/volumes" Feb 27 06:15:10 crc kubenswrapper[4725]: I0227 06:15:10.260710 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" path="/var/lib/kubelet/pods/3965afd0-6cf4-4ea2-86a1-ce69bb98f260/volumes" Feb 27 06:15:10 crc kubenswrapper[4725]: I0227 06:15:10.261480 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" path="/var/lib/kubelet/pods/3bfc8e5f-5a0f-4384-a2af-0817928d8ba5/volumes" Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.336552 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7q5w4"] Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.337071 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7q5w4" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="registry-server" containerID="cri-o://86dc3f795e4dd01fe17afb556a8f9591d30053b756fc0a6768d252625c8ddb3c" gracePeriod=2 Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.654105 4725 generic.go:334] "Generic (PLEG): container finished" podID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerID="86dc3f795e4dd01fe17afb556a8f9591d30053b756fc0a6768d252625c8ddb3c" exitCode=0 Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.654155 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerDied","Data":"86dc3f795e4dd01fe17afb556a8f9591d30053b756fc0a6768d252625c8ddb3c"} Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.882613 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.976528 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv9fh\" (UniqueName: \"kubernetes.io/projected/c54a518b-2ef3-4edc-9148-80dd4485fc90-kube-api-access-jv9fh\") pod \"c54a518b-2ef3-4edc-9148-80dd4485fc90\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.976594 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-catalog-content\") pod \"c54a518b-2ef3-4edc-9148-80dd4485fc90\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.976656 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-utilities\") pod \"c54a518b-2ef3-4edc-9148-80dd4485fc90\" (UID: \"c54a518b-2ef3-4edc-9148-80dd4485fc90\") " Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.977796 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-utilities" (OuterVolumeSpecName: "utilities") pod "c54a518b-2ef3-4edc-9148-80dd4485fc90" (UID: "c54a518b-2ef3-4edc-9148-80dd4485fc90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:11 crc kubenswrapper[4725]: I0227 06:15:11.986006 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54a518b-2ef3-4edc-9148-80dd4485fc90-kube-api-access-jv9fh" (OuterVolumeSpecName: "kube-api-access-jv9fh") pod "c54a518b-2ef3-4edc-9148-80dd4485fc90" (UID: "c54a518b-2ef3-4edc-9148-80dd4485fc90"). InnerVolumeSpecName "kube-api-access-jv9fh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.078064 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv9fh\" (UniqueName: \"kubernetes.io/projected/c54a518b-2ef3-4edc-9148-80dd4485fc90-kube-api-access-jv9fh\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.078118 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.154165 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c54a518b-2ef3-4edc-9148-80dd4485fc90" (UID: "c54a518b-2ef3-4edc-9148-80dd4485fc90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.179480 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54a518b-2ef3-4edc-9148-80dd4485fc90-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.666050 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7q5w4" event={"ID":"c54a518b-2ef3-4edc-9148-80dd4485fc90","Type":"ContainerDied","Data":"555cbb8f753487898528303c8b1087910a511d4e05ac466869c87c8d69dffeb4"} Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.666143 4725 scope.go:117] "RemoveContainer" containerID="86dc3f795e4dd01fe17afb556a8f9591d30053b756fc0a6768d252625c8ddb3c" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.667766 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7q5w4" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.700474 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7q5w4"] Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.705681 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7q5w4"] Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.706517 4725 scope.go:117] "RemoveContainer" containerID="432e3bb2fff995c298e3e5f73e6564e1b09da1e719b77e2fe017b5193fffa936" Feb 27 06:15:12 crc kubenswrapper[4725]: I0227 06:15:12.732320 4725 scope.go:117] "RemoveContainer" containerID="dc986bc006ee77e08888a5e21c7000e3eb3ea695456447930409a40d6341c895" Feb 27 06:15:14 crc kubenswrapper[4725]: I0227 06:15:14.265621 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" path="/var/lib/kubelet/pods/c54a518b-2ef3-4edc-9148-80dd4485fc90/volumes" Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.331042 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-744f9f95b5-sqpgm"] Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.331696 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" podUID="2bff9ef6-621e-46ce-b7b1-c62a9153c370" containerName="controller-manager" containerID="cri-o://50a16f1288cb11af34d5d1b9f52aa1f0845a4017c54393c4b16510eb90288d80" gracePeriod=30 Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.428396 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf"] Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.428664 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" podUID="2f572895-29f6-403d-9e14-c508c33140df" containerName="route-controller-manager" containerID="cri-o://2b48832c92343d99b27c48803bd03f82e0b945416f57b98308e111d557454a07" gracePeriod=30 Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.726781 4725 generic.go:334] "Generic (PLEG): container finished" podID="2f572895-29f6-403d-9e14-c508c33140df" containerID="2b48832c92343d99b27c48803bd03f82e0b945416f57b98308e111d557454a07" exitCode=0 Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.727174 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" event={"ID":"2f572895-29f6-403d-9e14-c508c33140df","Type":"ContainerDied","Data":"2b48832c92343d99b27c48803bd03f82e0b945416f57b98308e111d557454a07"} Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.729215 4725 generic.go:334] "Generic (PLEG): container finished" podID="2bff9ef6-621e-46ce-b7b1-c62a9153c370" containerID="50a16f1288cb11af34d5d1b9f52aa1f0845a4017c54393c4b16510eb90288d80" exitCode=0 Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.729261 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" event={"ID":"2bff9ef6-621e-46ce-b7b1-c62a9153c370","Type":"ContainerDied","Data":"50a16f1288cb11af34d5d1b9f52aa1f0845a4017c54393c4b16510eb90288d80"} Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.901169 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:19 crc kubenswrapper[4725]: I0227 06:15:19.908818 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.000823 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-client-ca\") pod \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.000895 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55gnx\" (UniqueName: \"kubernetes.io/projected/2f572895-29f6-403d-9e14-c508c33140df-kube-api-access-55gnx\") pod \"2f572895-29f6-403d-9e14-c508c33140df\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.000927 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-client-ca\") pod \"2f572895-29f6-403d-9e14-c508c33140df\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.000945 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-config\") pod \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.000974 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bff9ef6-621e-46ce-b7b1-c62a9153c370-serving-cert\") pod \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.001018 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-config\") pod \"2f572895-29f6-403d-9e14-c508c33140df\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.001056 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f572895-29f6-403d-9e14-c508c33140df-serving-cert\") pod \"2f572895-29f6-403d-9e14-c508c33140df\" (UID: \"2f572895-29f6-403d-9e14-c508c33140df\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.001092 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-proxy-ca-bundles\") pod \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.001137 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztp7w\" (UniqueName: \"kubernetes.io/projected/2bff9ef6-621e-46ce-b7b1-c62a9153c370-kube-api-access-ztp7w\") pod \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\" (UID: \"2bff9ef6-621e-46ce-b7b1-c62a9153c370\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.002871 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-client-ca" (OuterVolumeSpecName: "client-ca") pod "2bff9ef6-621e-46ce-b7b1-c62a9153c370" (UID: "2bff9ef6-621e-46ce-b7b1-c62a9153c370"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.003424 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-config" (OuterVolumeSpecName: "config") pod "2f572895-29f6-403d-9e14-c508c33140df" (UID: "2f572895-29f6-403d-9e14-c508c33140df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.003458 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f572895-29f6-403d-9e14-c508c33140df" (UID: "2f572895-29f6-403d-9e14-c508c33140df"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.003710 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-config" (OuterVolumeSpecName: "config") pod "2bff9ef6-621e-46ce-b7b1-c62a9153c370" (UID: "2bff9ef6-621e-46ce-b7b1-c62a9153c370"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.003756 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2bff9ef6-621e-46ce-b7b1-c62a9153c370" (UID: "2bff9ef6-621e-46ce-b7b1-c62a9153c370"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.015871 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bff9ef6-621e-46ce-b7b1-c62a9153c370-kube-api-access-ztp7w" (OuterVolumeSpecName: "kube-api-access-ztp7w") pod "2bff9ef6-621e-46ce-b7b1-c62a9153c370" (UID: "2bff9ef6-621e-46ce-b7b1-c62a9153c370"). InnerVolumeSpecName "kube-api-access-ztp7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.016321 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f572895-29f6-403d-9e14-c508c33140df-kube-api-access-55gnx" (OuterVolumeSpecName: "kube-api-access-55gnx") pod "2f572895-29f6-403d-9e14-c508c33140df" (UID: "2f572895-29f6-403d-9e14-c508c33140df"). InnerVolumeSpecName "kube-api-access-55gnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.023333 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f572895-29f6-403d-9e14-c508c33140df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f572895-29f6-403d-9e14-c508c33140df" (UID: "2f572895-29f6-403d-9e14-c508c33140df"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.027104 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bff9ef6-621e-46ce-b7b1-c62a9153c370-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2bff9ef6-621e-46ce-b7b1-c62a9153c370" (UID: "2bff9ef6-621e-46ce-b7b1-c62a9153c370"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102655 4725 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f572895-29f6-403d-9e14-c508c33140df-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102919 4725 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102935 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztp7w\" (UniqueName: \"kubernetes.io/projected/2bff9ef6-621e-46ce-b7b1-c62a9153c370-kube-api-access-ztp7w\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102948 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102960 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55gnx\" (UniqueName: \"kubernetes.io/projected/2f572895-29f6-403d-9e14-c508c33140df-kube-api-access-55gnx\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102972 4725 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102985 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bff9ef6-621e-46ce-b7b1-c62a9153c370-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.102995 4725 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bff9ef6-621e-46ce-b7b1-c62a9153c370-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.103007 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f572895-29f6-403d-9e14-c508c33140df-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.119209 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerName="oauth-openshift" containerID="cri-o://02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808" gracePeriod=15 Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.535921 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.612269 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-policies\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.612948 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613093 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h487b\" (UniqueName: \"kubernetes.io/projected/e0a18ecb-f59a-412e-b224-0bdbd115bd90-kube-api-access-h487b\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613120 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-serving-cert\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613569 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-trusted-ca-bundle\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613612 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-router-certs\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613640 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-idp-0-file-data\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc 
kubenswrapper[4725]: I0227 06:15:20.613666 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-service-ca\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613685 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-ocp-branding-template\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613731 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-session\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613751 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-cliconfig\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613753 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613789 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-error\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613809 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-login\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613827 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-provider-selection\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.613893 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-dir\") pod \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\" (UID: \"e0a18ecb-f59a-412e-b224-0bdbd115bd90\") " Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.614075 4725 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.614086 4725 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.614127 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.614271 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.615557 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.621263 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.622338 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.622851 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a18ecb-f59a-412e-b224-0bdbd115bd90-kube-api-access-h487b" (OuterVolumeSpecName: "kube-api-access-h487b") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "kube-api-access-h487b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.622939 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.623263 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.625528 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.633627 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.635625 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.636027 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e0a18ecb-f59a-412e-b224-0bdbd115bd90" (UID: "e0a18ecb-f59a-412e-b224-0bdbd115bd90"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.714933 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h487b\" (UniqueName: \"kubernetes.io/projected/e0a18ecb-f59a-412e-b224-0bdbd115bd90-kube-api-access-h487b\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.714980 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715000 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715013 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715027 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715041 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715050 4725 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715059 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715068 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715080 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715091 4725 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0a18ecb-f59a-412e-b224-0bdbd115bd90-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.715110 4725 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0a18ecb-f59a-412e-b224-0bdbd115bd90-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.734735 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" event={"ID":"2f572895-29f6-403d-9e14-c508c33140df","Type":"ContainerDied","Data":"c158d6ae94ef5a0f73e022f4e5fb44fded3a0016b8eec79ab662bba7cbb363e8"} Feb 
27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.734795 4725 scope.go:117] "RemoveContainer" containerID="2b48832c92343d99b27c48803bd03f82e0b945416f57b98308e111d557454a07" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.736115 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.736743 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" event={"ID":"2bff9ef6-621e-46ce-b7b1-c62a9153c370","Type":"ContainerDied","Data":"3497cb1cf6a17bece941a2c5cef3e2850130099f661fd1ea7e6876cd1f848d57"} Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.736780 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-744f9f95b5-sqpgm" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.738616 4725 generic.go:334] "Generic (PLEG): container finished" podID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerID="02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808" exitCode=0 Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.738659 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" event={"ID":"e0a18ecb-f59a-412e-b224-0bdbd115bd90","Type":"ContainerDied","Data":"02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808"} Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.738688 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" event={"ID":"e0a18ecb-f59a-412e-b224-0bdbd115bd90","Type":"ContainerDied","Data":"2df17f0d3d61a2a2c63d302472136cf3eb27da9b4982a9218bcc239304a501fa"} Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.738755 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zbhxj" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.760410 4725 scope.go:117] "RemoveContainer" containerID="50a16f1288cb11af34d5d1b9f52aa1f0845a4017c54393c4b16510eb90288d80" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.770038 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-744f9f95b5-sqpgm"] Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.773149 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-744f9f95b5-sqpgm"] Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.780505 4725 scope.go:117] "RemoveContainer" containerID="02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.783573 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zbhxj"] Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.786306 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zbhxj"] Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.798745 4725 scope.go:117] "RemoveContainer" containerID="02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808" Feb 27 06:15:20 crc kubenswrapper[4725]: E0227 06:15:20.799354 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808\": container with ID starting with 02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808 not found: ID does not exist" containerID="02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.799409 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808"} err="failed to get container status \"02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808\": rpc error: code = NotFound desc = could not find container \"02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808\": container with ID starting with 02a67a3a56ec8d9060e0f5016fc5b0bf37243b065c52644f0da86a348ca8a808 not found: ID does not exist" Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.801394 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf"] Feb 27 06:15:20 crc kubenswrapper[4725]: I0227 06:15:20.804111 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bfc87c88-qmgkf"] Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286342 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7757f4446d-7smzn"] Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286753 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerName="oauth-openshift" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286783 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerName="oauth-openshift" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286799 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286814 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286836 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286850 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286875 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286888 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286905 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286918 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286937 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286949 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286966 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.286979 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.286995 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287007 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287026 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287038 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287055 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f572895-29f6-403d-9e14-c508c33140df" containerName="route-controller-manager" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287068 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f572895-29f6-403d-9e14-c508c33140df" containerName="route-controller-manager" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287085 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287101 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287119 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bff9ef6-621e-46ce-b7b1-c62a9153c370" containerName="controller-manager" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287131 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bff9ef6-621e-46ce-b7b1-c62a9153c370" containerName="controller-manager" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287151 4725 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287163 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="extract-utilities" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287180 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287193 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: E0227 06:15:21.287211 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287223 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="extract-content" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287420 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2079d9d5-1660-4e80-a909-40d68fbe3c87" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287437 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3965afd0-6cf4-4ea2-86a1-ce69bb98f260" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287455 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" containerName="oauth-openshift" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287471 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f572895-29f6-403d-9e14-c508c33140df" containerName="route-controller-manager" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287495 4725 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2bff9ef6-621e-46ce-b7b1-c62a9153c370" containerName="controller-manager" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287513 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54a518b-2ef3-4edc-9148-80dd4485fc90" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.287530 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfc8e5f-5a0f-4384-a2af-0817928d8ba5" containerName="registry-server" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.288125 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.289926 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.292229 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.292721 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.293330 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.293709 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.293936 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.296593 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf"] Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.297896 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.303986 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.303989 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.304734 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.305145 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.309671 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.315313 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.315689 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7757f4446d-7smzn"] Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.316930 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.323908 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-proxy-ca-bundles\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.323974 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-serving-cert\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.324052 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-config\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.325746 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkls\" (UniqueName: \"kubernetes.io/projected/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-kube-api-access-zfkls\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.325872 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-client-ca\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " 
pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.332458 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf"] Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.426999 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshnd\" (UniqueName: \"kubernetes.io/projected/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-kube-api-access-vshnd\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427109 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-proxy-ca-bundles\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427160 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-config\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427204 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-serving-cert\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " 
pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427257 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-config\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427349 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-serving-cert\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427407 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkls\" (UniqueName: \"kubernetes.io/projected/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-kube-api-access-zfkls\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-client-ca\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.427479 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-client-ca\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.429281 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-proxy-ca-bundles\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.429628 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-client-ca\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.429791 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-config\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.433830 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-serving-cert\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.458761 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zfkls\" (UniqueName: \"kubernetes.io/projected/03e7ec9e-11e1-47b8-9fbf-58abc3365c1a-kube-api-access-zfkls\") pod \"controller-manager-7757f4446d-7smzn\" (UID: \"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a\") " pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.529464 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-client-ca\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.529578 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vshnd\" (UniqueName: \"kubernetes.io/projected/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-kube-api-access-vshnd\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.529668 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-config\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.529740 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-serving-cert\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " 
pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.531859 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-config\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.532003 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-client-ca\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.535096 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-serving-cert\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.559829 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshnd\" (UniqueName: \"kubernetes.io/projected/b58a8e2c-2e60-4ebc-8baf-2707da8948e8-kube-api-access-vshnd\") pod \"route-controller-manager-9f4f5b69b-6mqnf\" (UID: \"b58a8e2c-2e60-4ebc-8baf-2707da8948e8\") " pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.621497 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.647429 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:21 crc kubenswrapper[4725]: I0227 06:15:21.983080 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf"] Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.109544 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7757f4446d-7smzn"] Feb 27 06:15:22 crc kubenswrapper[4725]: W0227 06:15:22.115554 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e7ec9e_11e1_47b8_9fbf_58abc3365c1a.slice/crio-a10946f297b2dac32b65fc270caeaa541c2ea0b61dbd93fe4ad2df7d87c92497 WatchSource:0}: Error finding container a10946f297b2dac32b65fc270caeaa541c2ea0b61dbd93fe4ad2df7d87c92497: Status 404 returned error can't find the container with id a10946f297b2dac32b65fc270caeaa541c2ea0b61dbd93fe4ad2df7d87c92497 Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.265107 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bff9ef6-621e-46ce-b7b1-c62a9153c370" path="/var/lib/kubelet/pods/2bff9ef6-621e-46ce-b7b1-c62a9153c370/volumes" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.265630 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f572895-29f6-403d-9e14-c508c33140df" path="/var/lib/kubelet/pods/2f572895-29f6-403d-9e14-c508c33140df/volumes" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.266116 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a18ecb-f59a-412e-b224-0bdbd115bd90" path="/var/lib/kubelet/pods/e0a18ecb-f59a-412e-b224-0bdbd115bd90/volumes" 
Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.769755 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" event={"ID":"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a","Type":"ContainerStarted","Data":"9764801cb28182d0d0861ea33acc241703a40401338fe211b5e5f49bf7d7262a"} Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.771342 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.771448 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" event={"ID":"03e7ec9e-11e1-47b8-9fbf-58abc3365c1a","Type":"ContainerStarted","Data":"a10946f297b2dac32b65fc270caeaa541c2ea0b61dbd93fe4ad2df7d87c92497"} Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.771909 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" event={"ID":"b58a8e2c-2e60-4ebc-8baf-2707da8948e8","Type":"ContainerStarted","Data":"0e97fe62d826931dbfd17aca259eaf4bbe01cf1ad87df441c3c8b006bf376379"} Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.772047 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" event={"ID":"b58a8e2c-2e60-4ebc-8baf-2707da8948e8","Type":"ContainerStarted","Data":"9ddcd62a33dade75ce66c064285c4854ac4fd06be64995870d86bec916e4fbca"} Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.772258 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.777647 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.779975 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.796757 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7757f4446d-7smzn" podStartSLOduration=3.796730632 podStartE2EDuration="3.796730632s" podCreationTimestamp="2026-02-27 06:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:15:22.795996131 +0000 UTC m=+301.258616730" watchObservedRunningTime="2026-02-27 06:15:22.796730632 +0000 UTC m=+301.259351231" Feb 27 06:15:22 crc kubenswrapper[4725]: I0227 06:15:22.818117 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9f4f5b69b-6mqnf" podStartSLOduration=3.818087917 podStartE2EDuration="3.818087917s" podCreationTimestamp="2026-02-27 06:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:15:22.814482993 +0000 UTC m=+301.277103612" watchObservedRunningTime="2026-02-27 06:15:22.818087917 +0000 UTC m=+301.280708516" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.076444 4725 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.077883 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4" gracePeriod=15 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.077953 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06" gracePeriod=15 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.078024 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd" gracePeriod=15 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.078067 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb" gracePeriod=15 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.078165 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52" gracePeriod=15 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.078867 4725 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079186 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 
27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079207 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079232 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079244 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079264 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079279 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079332 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079347 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079366 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079382 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079408 4725 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079425 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079445 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079460 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079479 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079495 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079520 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079535 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.079554 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079568 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079783 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079806 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079829 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079855 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079876 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079897 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079913 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.079932 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.080417 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.082749 4725 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.084091 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.089508 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190173 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190391 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190430 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190585 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190688 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190758 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.190837 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.240553 4725 kubelet.go:1929] "Failed creating a mirror pod 
for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292607 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292692 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292729 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292754 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292801 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292809 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292845 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.292928 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293020 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293221 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293243 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293283 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293326 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293367 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.293838 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.541549 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:25 crc kubenswrapper[4725]: E0227 06:15:25.579050 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189805e2eda725e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:15:25.578192359 +0000 UTC m=+304.040812938,LastTimestamp:2026-02-27 06:15:25.578192359 +0000 UTC m=+304.040812938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.794250 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="16976e74-aa71-40ae-a441-adfc92420ac5" containerID="ab5fd7ef0320fdefbc5ef0ac447d9b8b6f47d750a8c6ad340364f16cad65a4c7" exitCode=0 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.794348 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"16976e74-aa71-40ae-a441-adfc92420ac5","Type":"ContainerDied","Data":"ab5fd7ef0320fdefbc5ef0ac447d9b8b6f47d750a8c6ad340364f16cad65a4c7"} Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.795415 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3624e41afc1da5387eb6a92145e9408a37f2348f8f62c601e5bb876fd428e6f3"} Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.795614 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.800302 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.801964 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.803171 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06" exitCode=0 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.803247 4725 scope.go:117] "RemoveContainer" 
containerID="9e5ebffd8911627c02548f97e0e4af1fde7d478d10ca9bc3a9642593c1922ef4" Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.803198 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52" exitCode=0 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.803486 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd" exitCode=0 Feb 27 06:15:25 crc kubenswrapper[4725]: I0227 06:15:25.803497 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb" exitCode=2 Feb 27 06:15:26 crc kubenswrapper[4725]: I0227 06:15:26.817064 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 06:15:26 crc kubenswrapper[4725]: I0227 06:15:26.821233 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488"} Feb 27 06:15:26 crc kubenswrapper[4725]: E0227 06:15:26.822569 4725 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:26 crc kubenswrapper[4725]: I0227 06:15:26.823587 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.273905 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.277674 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.322426 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16976e74-aa71-40ae-a441-adfc92420ac5-kube-api-access\") pod \"16976e74-aa71-40ae-a441-adfc92420ac5\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.322523 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-var-lock\") pod \"16976e74-aa71-40ae-a441-adfc92420ac5\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.322620 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-kubelet-dir\") pod \"16976e74-aa71-40ae-a441-adfc92420ac5\" (UID: \"16976e74-aa71-40ae-a441-adfc92420ac5\") " Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.322767 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-var-lock" 
(OuterVolumeSpecName: "var-lock") pod "16976e74-aa71-40ae-a441-adfc92420ac5" (UID: "16976e74-aa71-40ae-a441-adfc92420ac5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.322916 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "16976e74-aa71-40ae-a441-adfc92420ac5" (UID: "16976e74-aa71-40ae-a441-adfc92420ac5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.322966 4725 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.337614 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16976e74-aa71-40ae-a441-adfc92420ac5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "16976e74-aa71-40ae-a441-adfc92420ac5" (UID: "16976e74-aa71-40ae-a441-adfc92420ac5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.424193 4725 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16976e74-aa71-40ae-a441-adfc92420ac5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.424215 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16976e74-aa71-40ae-a441-adfc92420ac5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.507127 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.507982 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.508377 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.508530 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.626058 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.626163 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.626337 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.626604 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.626636 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.626651 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.728272 4725 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.728344 4725 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.728364 4725 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.833000 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.834953 4725 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4" exitCode=0 Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.835096 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.835134 4725 scope.go:117] "RemoveContainer" containerID="0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.839143 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"16976e74-aa71-40ae-a441-adfc92420ac5","Type":"ContainerDied","Data":"f1acff55b97fd11bf3d6836d0a1a9a7a07d7473d03b9cdfd70665f351b41108d"} Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.839199 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1acff55b97fd11bf3d6836d0a1a9a7a07d7473d03b9cdfd70665f351b41108d" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.839197 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.841644 4725 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.863077 4725 scope.go:117] "RemoveContainer" containerID="20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.868106 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.868840 4725 status_manager.go:851] "Failed to get status for 
pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.869525 4725 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.870104 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.892160 4725 scope.go:117] "RemoveContainer" containerID="223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.917003 4725 scope.go:117] "RemoveContainer" containerID="633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.937909 4725 scope.go:117] "RemoveContainer" containerID="630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.957828 4725 scope.go:117] "RemoveContainer" containerID="d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.984495 4725 scope.go:117] "RemoveContainer" containerID="0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.985092 4725 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\": container with ID starting with 0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06 not found: ID does not exist" containerID="0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.985181 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06"} err="failed to get container status \"0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\": rpc error: code = NotFound desc = could not find container \"0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06\": container with ID starting with 0da959734f0e2d085c31aad1b013862dc5999eefa2b4a48dc8b08eb16c78be06 not found: ID does not exist" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.985234 4725 scope.go:117] "RemoveContainer" containerID="20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.986690 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\": container with ID starting with 20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52 not found: ID does not exist" containerID="20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.986758 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52"} err="failed to get container status \"20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\": rpc error: code = NotFound 
desc = could not find container \"20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52\": container with ID starting with 20ed0146ba0ec2710049d037617ee874eb9d97b203706537d336266f23871c52 not found: ID does not exist" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.986808 4725 scope.go:117] "RemoveContainer" containerID="223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.987342 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\": container with ID starting with 223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd not found: ID does not exist" containerID="223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.987393 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd"} err="failed to get container status \"223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\": rpc error: code = NotFound desc = could not find container \"223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd\": container with ID starting with 223336ed12b3e0d742a7034c6530ebf9735edf85711eab7d4149eac059648efd not found: ID does not exist" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.987425 4725 scope.go:117] "RemoveContainer" containerID="633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.987862 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\": container with ID starting with 
633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb not found: ID does not exist" containerID="633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.987935 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb"} err="failed to get container status \"633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\": rpc error: code = NotFound desc = could not find container \"633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb\": container with ID starting with 633179fa9d6afd408c9d52e542d8a958649aaa36b643563e9a0d58a9253b0dbb not found: ID does not exist" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.987987 4725 scope.go:117] "RemoveContainer" containerID="630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.988559 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\": container with ID starting with 630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4 not found: ID does not exist" containerID="630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.988603 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4"} err="failed to get container status \"630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\": rpc error: code = NotFound desc = could not find container \"630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4\": container with ID starting with 630585c2e25d5de37d607bd4a624f946068c4598b7af87e0c6a7ad1849dd89f4 not found: ID does not 
exist" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.988632 4725 scope.go:117] "RemoveContainer" containerID="d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f" Feb 27 06:15:27 crc kubenswrapper[4725]: E0227 06:15:27.989543 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\": container with ID starting with d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f not found: ID does not exist" containerID="d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f" Feb 27 06:15:27 crc kubenswrapper[4725]: I0227 06:15:27.989641 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f"} err="failed to get container status \"d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\": rpc error: code = NotFound desc = could not find container \"d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f\": container with ID starting with d8d865fd285b19c61fa4959bede89ff21ecd094187f61a7540a89de0c8ef5e1f not found: ID does not exist" Feb 27 06:15:28 crc kubenswrapper[4725]: I0227 06:15:28.266925 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 27 06:15:29 crc kubenswrapper[4725]: E0227 06:15:29.327857 4725 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" 
volumeName="registry-storage" Feb 27 06:15:29 crc kubenswrapper[4725]: E0227 06:15:29.966941 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189805e2eda725e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:15:25.578192359 +0000 UTC m=+304.040812938,LastTimestamp:2026-02-27 06:15:25.578192359 +0000 UTC m=+304.040812938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:15:32 crc kubenswrapper[4725]: I0227 06:15:32.254621 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.033001 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.033563 4725 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.034148 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.034390 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.034543 4725 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:34 crc kubenswrapper[4725]: I0227 06:15:34.034570 4725 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.034746 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" Feb 27 06:15:34 crc kubenswrapper[4725]: E0227 06:15:34.235909 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Feb 27 06:15:34 crc 
kubenswrapper[4725]: E0227 06:15:34.636731 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Feb 27 06:15:35 crc kubenswrapper[4725]: E0227 06:15:35.437619 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Feb 27 06:15:37 crc kubenswrapper[4725]: E0227 06:15:37.038658 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.942399 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.943484 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.943563 4725 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab" exitCode=1 Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.943632 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab"} Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.944547 4725 scope.go:117] "RemoveContainer" containerID="643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab" Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.945487 4725 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:39 crc kubenswrapper[4725]: I0227 06:15:39.946122 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:39 crc kubenswrapper[4725]: E0227 06:15:39.968739 4725 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189805e2eda725e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 06:15:25.578192359 +0000 UTC m=+304.040812938,LastTimestamp:2026-02-27 06:15:25.578192359 +0000 UTC m=+304.040812938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 06:15:40 crc kubenswrapper[4725]: E0227 06:15:40.240034 4725 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="6.4s" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.251180 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.252239 4725 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.252807 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.276090 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.276149 4725 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:40 crc kubenswrapper[4725]: E0227 06:15:40.276699 4725 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.277351 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.958503 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.959617 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.959720 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a780a743d48108ddec8daf75814797a6a3bdcac68d4e3b4538263405a1f1baae"} Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.960892 4725 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.961521 4725 status_manager.go:851] "Failed to 
get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.962017 4725 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2a9f200794f435bafeb853825e0338588cd5089db6f2c8ace44f6796be663696" exitCode=0 Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.962076 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2a9f200794f435bafeb853825e0338588cd5089db6f2c8ace44f6796be663696"} Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.962130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c6887b26259b52bb372c229e1526e8eec350697d715bff79b2266de5a9e6347"} Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.962621 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.962655 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.963010 4725 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" 
Feb 27 06:15:40 crc kubenswrapper[4725]: E0227 06:15:40.963348 4725 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:40 crc kubenswrapper[4725]: I0227 06:15:40.963751 4725 status_manager.go:851] "Failed to get status for pod" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Feb 27 06:15:41 crc kubenswrapper[4725]: I0227 06:15:41.971899 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0daf5f56f8efc2e2fa26e58e9ac5ab88db02280b834450146bc5a1634acb0689"} Feb 27 06:15:41 crc kubenswrapper[4725]: I0227 06:15:41.972349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c669c039e4b8cd3d8fa1d47c8126b7b9fb40c2888c51ea6acab069a26b9457d5"} Feb 27 06:15:41 crc kubenswrapper[4725]: I0227 06:15:41.972364 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c093549d37df3f93289283d7d1fb29efc183d36b8591076b5319060a580a5b57"} Feb 27 06:15:42 crc kubenswrapper[4725]: I0227 06:15:42.018651 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:15:42 crc kubenswrapper[4725]: I0227 06:15:42.979833 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d4d87876edf572c2e0ec1d77ab4e86f8d96bfe16080dd98339f9aa785aad08cd"} Feb 27 06:15:42 crc kubenswrapper[4725]: I0227 06:15:42.980105 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:42 crc kubenswrapper[4725]: I0227 06:15:42.980120 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc791ab1510417490771f9ffb455e5eebeda1e0f24ddf5a187f3e7217df59130"} Feb 27 06:15:42 crc kubenswrapper[4725]: I0227 06:15:42.980187 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:42 crc kubenswrapper[4725]: I0227 06:15:42.980208 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:45 crc kubenswrapper[4725]: I0227 06:15:45.278041 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:45 crc kubenswrapper[4725]: I0227 06:15:45.280250 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:45 crc kubenswrapper[4725]: I0227 06:15:45.286742 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:47 crc kubenswrapper[4725]: I0227 06:15:47.991836 4725 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:48 crc kubenswrapper[4725]: I0227 06:15:48.025172 4725 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:48 crc kubenswrapper[4725]: I0227 06:15:48.025203 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:48 crc kubenswrapper[4725]: I0227 06:15:48.028981 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:15:48 crc kubenswrapper[4725]: I0227 06:15:48.032242 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ea9512ab-3170-4197-847a-0c59dd2210de" Feb 27 06:15:49 crc kubenswrapper[4725]: I0227 06:15:49.031151 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:49 crc kubenswrapper[4725]: I0227 06:15:49.031181 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:15:49 crc kubenswrapper[4725]: I0227 06:15:49.216448 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:15:49 crc kubenswrapper[4725]: I0227 06:15:49.216907 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 06:15:49 crc kubenswrapper[4725]: I0227 06:15:49.216994 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 06:15:52 crc kubenswrapper[4725]: I0227 06:15:52.271255 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ea9512ab-3170-4197-847a-0c59dd2210de" Feb 27 06:15:57 crc kubenswrapper[4725]: I0227 06:15:57.513346 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 06:15:57 crc kubenswrapper[4725]: I0227 06:15:57.661427 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 06:15:57 crc kubenswrapper[4725]: I0227 06:15:57.817765 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 06:15:58 crc kubenswrapper[4725]: I0227 06:15:58.504373 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 06:15:58 crc kubenswrapper[4725]: I0227 06:15:58.901383 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.006908 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.084668 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.216458 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.216521 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.370494 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.428561 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.434263 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.622042 4725 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 06:15:59 crc kubenswrapper[4725]: I0227 06:15:59.985222 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.088965 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.240313 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 06:16:00 crc 
kubenswrapper[4725]: I0227 06:16:00.254783 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.288737 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.423530 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.460136 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.493273 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.672233 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.690733 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.790143 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.889974 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 06:16:00 crc kubenswrapper[4725]: I0227 06:16:00.961624 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.157051 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.191695 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.209862 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.236066 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.296888 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.496594 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.500408 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.580431 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.611882 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.710888 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.720715 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.725530 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.732620 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.773629 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.782633 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.853123 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 06:16:01 crc kubenswrapper[4725]: I0227 06:16:01.934060 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.093486 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.138215 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.153986 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.168473 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.172517 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 06:16:02 crc 
kubenswrapper[4725]: I0227 06:16:02.176544 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.308969 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.325811 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.360936 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.391545 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.514111 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.611447 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.654218 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.716807 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.819829 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.883100 4725 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 06:16:02 crc kubenswrapper[4725]: I0227 06:16:02.900266 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.038238 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.064013 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.092858 4725 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.100794 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.100877 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 06:16:03 crc kubenswrapper[4725]: E0227 06:16:03.101205 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" containerName="installer" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.101241 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" containerName="installer" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.101486 4725 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.101579 4725 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="20acbe1e-5472-4b81-b830-d2cb9c19f564" Feb 27 06:16:03 
crc kubenswrapper[4725]: I0227 06:16:03.101636 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="16976e74-aa71-40ae-a441-adfc92420ac5" containerName="installer" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.102375 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.107684 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.108032 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.108222 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.108450 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.108635 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.108890 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.109251 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.109679 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.111632 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.112070 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.112658 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.113138 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.115216 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.115594 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.120715 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.125399 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.140260 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.140244177 podStartE2EDuration="15.140244177s" podCreationTimestamp="2026-02-27 06:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:16:03.140239487 +0000 UTC m=+341.602860066" watchObservedRunningTime="2026-02-27 06:16:03.140244177 +0000 UTC m=+341.602864756" Feb 27 
06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.147397 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.160384 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.194441 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.195246 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.195424 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.195561 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.195667 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.195939 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196008 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-audit-policies\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196052 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-error\") pod 
\"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196099 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196138 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85f76afb-dbcf-444e-896d-21a0bc4c6a75-audit-dir\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196181 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbnm\" (UniqueName: \"kubernetes.io/projected/85f76afb-dbcf-444e-896d-21a0bc4c6a75-kube-api-access-pzbnm\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196235 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc 
kubenswrapper[4725]: I0227 06:16:03.196317 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.196363 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.219900 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297043 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297118 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc 
kubenswrapper[4725]: I0227 06:16:03.297193 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297225 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297280 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-audit-policies\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297385 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297427 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85f76afb-dbcf-444e-896d-21a0bc4c6a75-audit-dir\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297544 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbnm\" (UniqueName: \"kubernetes.io/projected/85f76afb-dbcf-444e-896d-21a0bc4c6a75-kube-api-access-pzbnm\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297587 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " 
pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297627 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297669 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.297707 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.303507 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.303722 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85f76afb-dbcf-444e-896d-21a0bc4c6a75-audit-dir\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.304197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.305072 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.306563 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.307448 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 
27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.307560 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.308255 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85f76afb-dbcf-444e-896d-21a0bc4c6a75-audit-policies\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.308550 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.311194 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.312925 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.316159 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.321524 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85f76afb-dbcf-444e-896d-21a0bc4c6a75-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.331006 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.335514 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbnm\" (UniqueName: \"kubernetes.io/projected/85f76afb-dbcf-444e-896d-21a0bc4c6a75-kube-api-access-pzbnm\") pod \"oauth-openshift-5584c6b7fb-qd2pc\" (UID: \"85f76afb-dbcf-444e-896d-21a0bc4c6a75\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.387572 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.429120 4725 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.443987 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.446624 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.623278 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.668905 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.671264 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.686329 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.704792 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.775561 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.808811 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.951542 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Feb 27 06:16:03 crc kubenswrapper[4725]: I0227 06:16:03.994239 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.060865 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.070771 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.237981 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.311142 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.489279 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.496832 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.504667 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.520745 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.531847 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 06:16:04 crc 
kubenswrapper[4725]: I0227 06:16:04.558559 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.567513 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.771388 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.792455 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.802956 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.806914 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.920615 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 06:16:04 crc kubenswrapper[4725]: I0227 06:16:04.967438 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.263715 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc"] Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.269984 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.308266 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.331426 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.355366 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.391640 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.441710 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.572042 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.628405 4725 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.631571 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.723927 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc"] Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.737970 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.793923 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.936681 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.955758 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.959986 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.990835 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 06:16:05 crc kubenswrapper[4725]: I0227 06:16:05.992527 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.130491 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.162384 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" event={"ID":"85f76afb-dbcf-444e-896d-21a0bc4c6a75","Type":"ContainerStarted","Data":"b3946bfc5d3087c28074812b746f42d4c48021762043247b65d023b911aa8d8f"} Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.162455 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" event={"ID":"85f76afb-dbcf-444e-896d-21a0bc4c6a75","Type":"ContainerStarted","Data":"656bf775450e65428253a814f77f3798942e1ed3d437fbf4a7ea87d951631eca"} Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.162912 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.168169 4725 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.178760 4725 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.184592 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" podStartSLOduration=71.184575601 podStartE2EDuration="1m11.184575601s" podCreationTimestamp="2026-02-27 06:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:16:06.183536083 +0000 UTC m=+344.646156682" watchObservedRunningTime="2026-02-27 06:16:06.184575601 +0000 UTC m=+344.647196170" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.236689 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.257838 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.356200 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.379385 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.430046 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.432274 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 06:16:06 crc 
kubenswrapper[4725]: I0227 06:16:06.466146 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.489069 4725 patch_prober.go:28] interesting pod/oauth-openshift-5584c6b7fb-qd2pc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.70:6443/healthz\": read tcp 10.217.0.2:44882->10.217.0.70:6443: read: connection reset by peer" start-of-body= Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.489135 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" podUID="85f76afb-dbcf-444e-896d-21a0bc4c6a75" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.70:6443/healthz\": read tcp 10.217.0.2:44882->10.217.0.70:6443: read: connection reset by peer" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.533908 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.745760 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.754381 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.757575 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.777440 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.806105 4725 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.841942 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.893142 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.922405 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.948059 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 06:16:06 crc kubenswrapper[4725]: I0227 06:16:06.958899 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.013342 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.023534 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.065016 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.173971 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5584c6b7fb-qd2pc_85f76afb-dbcf-444e-896d-21a0bc4c6a75/oauth-openshift/0.log" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.174078 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="85f76afb-dbcf-444e-896d-21a0bc4c6a75" containerID="b3946bfc5d3087c28074812b746f42d4c48021762043247b65d023b911aa8d8f" exitCode=255 Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.174131 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" event={"ID":"85f76afb-dbcf-444e-896d-21a0bc4c6a75","Type":"ContainerDied","Data":"b3946bfc5d3087c28074812b746f42d4c48021762043247b65d023b911aa8d8f"} Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.174694 4725 scope.go:117] "RemoveContainer" containerID="b3946bfc5d3087c28074812b746f42d4c48021762043247b65d023b911aa8d8f" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.271589 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.304961 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.305932 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.332805 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.339163 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.376669 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.617867 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.672877 
4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.686259 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.695866 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 06:16:07 crc kubenswrapper[4725]: I0227 06:16:07.715346 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.024795 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.036189 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.037706 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.107502 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.130189 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.161599 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.168643 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 06:16:08 crc 
kubenswrapper[4725]: I0227 06:16:08.184559 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5584c6b7fb-qd2pc_85f76afb-dbcf-444e-896d-21a0bc4c6a75/oauth-openshift/0.log" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.184633 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" event={"ID":"85f76afb-dbcf-444e-896d-21a0bc4c6a75","Type":"ContainerStarted","Data":"9cdd7f2db925bf975f091662e4975fa834e0d08bc4cad64453cd271d8717cb5d"} Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.185108 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.191632 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5584c6b7fb-qd2pc" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.263082 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.324748 4725 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.424818 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.558173 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.570936 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 06:16:08 crc kubenswrapper[4725]: I0227 06:16:08.689173 4725 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.058086 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.064347 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.129439 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.135036 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.196830 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.216462 4725 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.216535 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.216600 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.217584 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"a780a743d48108ddec8daf75814797a6a3bdcac68d4e3b4538263405a1f1baae"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.217786 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://a780a743d48108ddec8daf75814797a6a3bdcac68d4e3b4538263405a1f1baae" gracePeriod=30 Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.228330 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.234058 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.416992 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.560579 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.633137 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.796888 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 06:16:09 crc 
kubenswrapper[4725]: I0227 06:16:09.860696 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.875691 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.880893 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.965445 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 06:16:09 crc kubenswrapper[4725]: I0227 06:16:09.986211 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.019573 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.029038 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.147668 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.212610 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.229133 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.268478 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 06:16:10 crc 
kubenswrapper[4725]: I0227 06:16:10.270040 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.280202 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.522405 4725 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.522755 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488" gracePeriod=5 Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.523422 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.575788 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.631575 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.649168 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.672941 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.781204 4725 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.823326 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.834966 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.899542 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.911501 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.914831 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 06:16:10 crc kubenswrapper[4725]: I0227 06:16:10.952677 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.001317 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.018806 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.034553 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.104477 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 06:16:11 crc 
kubenswrapper[4725]: I0227 06:16:11.115069 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.283395 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.291487 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.292153 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.356541 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.378265 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.398540 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.777749 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.781335 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 06:16:11 crc kubenswrapper[4725]: I0227 06:16:11.788394 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.019031 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 
06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.025588 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.048522 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.141175 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.141232 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.208477 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.384672 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.454238 4725 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.468251 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.497151 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.509396 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.800933 4725 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.816990 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 06:16:12 crc kubenswrapper[4725]: I0227 06:16:12.976726 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.020471 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.066262 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.450073 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.556475 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.614122 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.801427 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 06:16:13 crc kubenswrapper[4725]: I0227 06:16:13.953723 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 06:16:14 crc kubenswrapper[4725]: I0227 06:16:14.117720 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 06:16:14 crc kubenswrapper[4725]: I0227 06:16:14.153841 4725 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 06:16:14 crc kubenswrapper[4725]: I0227 06:16:14.377689 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 06:16:14 crc kubenswrapper[4725]: I0227 06:16:14.393873 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 06:16:14 crc kubenswrapper[4725]: I0227 06:16:14.488021 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 06:16:15 crc kubenswrapper[4725]: E0227 06:16:15.629753 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-conmon-df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488.scope\": RecentStats: unable to find data in memory cache]" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.663161 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.663336 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788042 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788102 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788176 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788246 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788357 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788385 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788441 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788433 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.788485 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.789224 4725 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.789275 4725 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.789361 4725 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.789394 4725 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.796924 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:16:15 crc kubenswrapper[4725]: I0227 06:16:15.891498 4725 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.241859 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.241974 4725 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488" exitCode=137 Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.242037 4725 scope.go:117] "RemoveContainer" containerID="df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488" Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.242076 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.263470 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.271717 4725 scope.go:117] "RemoveContainer" containerID="df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488" Feb 27 06:16:16 crc kubenswrapper[4725]: E0227 06:16:16.272211 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488\": container with ID starting with df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488 not found: ID does not exist" containerID="df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488" Feb 27 06:16:16 crc kubenswrapper[4725]: I0227 06:16:16.272263 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488"} err="failed to get container status \"df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488\": rpc error: code = NotFound desc = could not find container \"df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488\": container with ID starting with df3468f8ad2d043a7955dac47ee1d849b457147cc749ec4b1825b8428180e488 not found: ID does not exist" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.301235 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536216-tw6gq"] Feb 27 06:16:32 crc kubenswrapper[4725]: E0227 06:16:32.302091 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 
06:16:32.302112 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.302280 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.302852 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.304555 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.304749 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.304775 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.339648 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536216-tw6gq"] Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.427223 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9g9\" (UniqueName: \"kubernetes.io/projected/d13fdc97-d080-45c4-a03e-12e51a0c85bf-kube-api-access-rn9g9\") pod \"auto-csr-approver-29536216-tw6gq\" (UID: \"d13fdc97-d080-45c4-a03e-12e51a0c85bf\") " pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.528163 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9g9\" (UniqueName: \"kubernetes.io/projected/d13fdc97-d080-45c4-a03e-12e51a0c85bf-kube-api-access-rn9g9\") pod \"auto-csr-approver-29536216-tw6gq\" (UID: 
\"d13fdc97-d080-45c4-a03e-12e51a0c85bf\") " pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.567621 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9g9\" (UniqueName: \"kubernetes.io/projected/d13fdc97-d080-45c4-a03e-12e51a0c85bf-kube-api-access-rn9g9\") pod \"auto-csr-approver-29536216-tw6gq\" (UID: \"d13fdc97-d080-45c4-a03e-12e51a0c85bf\") " pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:32 crc kubenswrapper[4725]: I0227 06:16:32.617747 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:33 crc kubenswrapper[4725]: I0227 06:16:33.037109 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536216-tw6gq"] Feb 27 06:16:33 crc kubenswrapper[4725]: I0227 06:16:33.369699 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" event={"ID":"d13fdc97-d080-45c4-a03e-12e51a0c85bf","Type":"ContainerStarted","Data":"2a517b4be571b2c1a42b29ccb02449483764beabfb0b4ae68ba04776e7305746"} Feb 27 06:16:34 crc kubenswrapper[4725]: I0227 06:16:34.007565 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 06:16:34 crc kubenswrapper[4725]: I0227 06:16:34.380123 4725 generic.go:334] "Generic (PLEG): container finished" podID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerID="886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec" exitCode=0 Feb 27 06:16:34 crc kubenswrapper[4725]: I0227 06:16:34.380260 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" event={"ID":"6b2da58a-3e24-4e72-a25d-eeee730910cd","Type":"ContainerDied","Data":"886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec"} Feb 27 06:16:34 crc 
kubenswrapper[4725]: I0227 06:16:34.380795 4725 scope.go:117] "RemoveContainer" containerID="886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec" Feb 27 06:16:34 crc kubenswrapper[4725]: I0227 06:16:34.383897 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" event={"ID":"d13fdc97-d080-45c4-a03e-12e51a0c85bf","Type":"ContainerStarted","Data":"79ed38471fa17537cea18a8f940e4ebf006c264304ac1259edf5facf64740b87"} Feb 27 06:16:34 crc kubenswrapper[4725]: I0227 06:16:34.436235 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" podStartSLOduration=1.640596159 podStartE2EDuration="2.436205516s" podCreationTimestamp="2026-02-27 06:16:32 +0000 UTC" firstStartedPulling="2026-02-27 06:16:33.04609142 +0000 UTC m=+371.508711999" lastFinishedPulling="2026-02-27 06:16:33.841700757 +0000 UTC m=+372.304321356" observedRunningTime="2026-02-27 06:16:34.428967033 +0000 UTC m=+372.891587602" watchObservedRunningTime="2026-02-27 06:16:34.436205516 +0000 UTC m=+372.898826125" Feb 27 06:16:35 crc kubenswrapper[4725]: I0227 06:16:35.394610 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" event={"ID":"6b2da58a-3e24-4e72-a25d-eeee730910cd","Type":"ContainerStarted","Data":"62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7"} Feb 27 06:16:35 crc kubenswrapper[4725]: I0227 06:16:35.395401 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:16:35 crc kubenswrapper[4725]: I0227 06:16:35.397508 4725 generic.go:334] "Generic (PLEG): container finished" podID="d13fdc97-d080-45c4-a03e-12e51a0c85bf" containerID="79ed38471fa17537cea18a8f940e4ebf006c264304ac1259edf5facf64740b87" exitCode=0 Feb 27 06:16:35 crc kubenswrapper[4725]: I0227 06:16:35.397584 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" event={"ID":"d13fdc97-d080-45c4-a03e-12e51a0c85bf","Type":"ContainerDied","Data":"79ed38471fa17537cea18a8f940e4ebf006c264304ac1259edf5facf64740b87"} Feb 27 06:16:35 crc kubenswrapper[4725]: I0227 06:16:35.401866 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:16:36 crc kubenswrapper[4725]: I0227 06:16:36.769992 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:36 crc kubenswrapper[4725]: I0227 06:16:36.890795 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn9g9\" (UniqueName: \"kubernetes.io/projected/d13fdc97-d080-45c4-a03e-12e51a0c85bf-kube-api-access-rn9g9\") pod \"d13fdc97-d080-45c4-a03e-12e51a0c85bf\" (UID: \"d13fdc97-d080-45c4-a03e-12e51a0c85bf\") " Feb 27 06:16:36 crc kubenswrapper[4725]: I0227 06:16:36.898688 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13fdc97-d080-45c4-a03e-12e51a0c85bf-kube-api-access-rn9g9" (OuterVolumeSpecName: "kube-api-access-rn9g9") pod "d13fdc97-d080-45c4-a03e-12e51a0c85bf" (UID: "d13fdc97-d080-45c4-a03e-12e51a0c85bf"). InnerVolumeSpecName "kube-api-access-rn9g9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:16:36 crc kubenswrapper[4725]: I0227 06:16:36.992851 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn9g9\" (UniqueName: \"kubernetes.io/projected/d13fdc97-d080-45c4-a03e-12e51a0c85bf-kube-api-access-rn9g9\") on node \"crc\" DevicePath \"\"" Feb 27 06:16:37 crc kubenswrapper[4725]: I0227 06:16:37.094949 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 06:16:37 crc kubenswrapper[4725]: I0227 06:16:37.413226 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" event={"ID":"d13fdc97-d080-45c4-a03e-12e51a0c85bf","Type":"ContainerDied","Data":"2a517b4be571b2c1a42b29ccb02449483764beabfb0b4ae68ba04776e7305746"} Feb 27 06:16:37 crc kubenswrapper[4725]: I0227 06:16:37.413258 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536216-tw6gq" Feb 27 06:16:37 crc kubenswrapper[4725]: I0227 06:16:37.413318 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a517b4be571b2c1a42b29ccb02449483764beabfb0b4ae68ba04776e7305746" Feb 27 06:16:39 crc kubenswrapper[4725]: I0227 06:16:39.437999 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 27 06:16:39 crc kubenswrapper[4725]: I0227 06:16:39.439682 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 06:16:39 crc kubenswrapper[4725]: I0227 06:16:39.440220 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 06:16:39 crc kubenswrapper[4725]: I0227 06:16:39.440261 4725 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a780a743d48108ddec8daf75814797a6a3bdcac68d4e3b4538263405a1f1baae" exitCode=137 Feb 27 06:16:39 crc kubenswrapper[4725]: I0227 06:16:39.440309 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a780a743d48108ddec8daf75814797a6a3bdcac68d4e3b4538263405a1f1baae"} Feb 27 06:16:39 crc kubenswrapper[4725]: I0227 06:16:39.440349 4725 scope.go:117] "RemoveContainer" containerID="643cccf040dc93dc0f16575b038f609f0def7aae41ec05915bf3daed645001ab" Feb 27 06:16:40 crc kubenswrapper[4725]: I0227 06:16:40.450427 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 27 06:16:40 crc kubenswrapper[4725]: I0227 06:16:40.452859 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 06:16:40 crc kubenswrapper[4725]: I0227 06:16:40.452961 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"980cc6c349930f6663cf91bc2600099d6102504b5e825a6735466d15eccc8904"} Feb 27 06:16:42 crc kubenswrapper[4725]: I0227 06:16:42.018775 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:16:49 crc kubenswrapper[4725]: I0227 
06:16:49.217008 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:16:49 crc kubenswrapper[4725]: I0227 06:16:49.224922 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:16:49 crc kubenswrapper[4725]: I0227 06:16:49.238240 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 06:16:52 crc kubenswrapper[4725]: I0227 06:16:52.025150 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 06:17:02 crc kubenswrapper[4725]: I0227 06:17:02.556849 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:17:02 crc kubenswrapper[4725]: I0227 06:17:02.558167 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.387658 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p89j6"] Feb 27 06:17:27 crc kubenswrapper[4725]: E0227 06:17:27.388334 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13fdc97-d080-45c4-a03e-12e51a0c85bf" containerName="oc" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.388350 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d13fdc97-d080-45c4-a03e-12e51a0c85bf" containerName="oc" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.388480 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13fdc97-d080-45c4-a03e-12e51a0c85bf" containerName="oc" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.388934 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.417519 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p89j6"] Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452226 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452299 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d481d1a7-d116-4dd0-8d67-65654d50228e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452333 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-registry-tls\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc 
kubenswrapper[4725]: I0227 06:17:27.452365 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw74\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-kube-api-access-bpw74\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452384 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d481d1a7-d116-4dd0-8d67-65654d50228e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452401 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-bound-sa-token\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452422 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d481d1a7-d116-4dd0-8d67-65654d50228e-trusted-ca\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.452454 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d481d1a7-d116-4dd0-8d67-65654d50228e-registry-certificates\") 
pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.507654 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554089 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d481d1a7-d116-4dd0-8d67-65654d50228e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554147 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-registry-tls\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554189 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw74\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-kube-api-access-bpw74\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554213 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d481d1a7-d116-4dd0-8d67-65654d50228e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554235 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-bound-sa-token\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554262 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d481d1a7-d116-4dd0-8d67-65654d50228e-trusted-ca\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.554343 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d481d1a7-d116-4dd0-8d67-65654d50228e-registry-certificates\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.555545 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d481d1a7-d116-4dd0-8d67-65654d50228e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.555898 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d481d1a7-d116-4dd0-8d67-65654d50228e-registry-certificates\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.556047 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d481d1a7-d116-4dd0-8d67-65654d50228e-trusted-ca\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.563116 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-registry-tls\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.565246 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d481d1a7-d116-4dd0-8d67-65654d50228e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.575882 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw74\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-kube-api-access-bpw74\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc 
kubenswrapper[4725]: I0227 06:17:27.584561 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d481d1a7-d116-4dd0-8d67-65654d50228e-bound-sa-token\") pod \"image-registry-66df7c8f76-p89j6\" (UID: \"d481d1a7-d116-4dd0-8d67-65654d50228e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:27 crc kubenswrapper[4725]: I0227 06:17:27.706825 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:28 crc kubenswrapper[4725]: I0227 06:17:28.607742 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p89j6"] Feb 27 06:17:28 crc kubenswrapper[4725]: W0227 06:17:28.618649 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd481d1a7_d116_4dd0_8d67_65654d50228e.slice/crio-6b472fa5d034930c64488d58295220ce53bead6f191a22c299f4f8276e615b7f WatchSource:0}: Error finding container 6b472fa5d034930c64488d58295220ce53bead6f191a22c299f4f8276e615b7f: Status 404 returned error can't find the container with id 6b472fa5d034930c64488d58295220ce53bead6f191a22c299f4f8276e615b7f Feb 27 06:17:28 crc kubenswrapper[4725]: I0227 06:17:28.768568 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" event={"ID":"d481d1a7-d116-4dd0-8d67-65654d50228e","Type":"ContainerStarted","Data":"6b472fa5d034930c64488d58295220ce53bead6f191a22c299f4f8276e615b7f"} Feb 27 06:17:29 crc kubenswrapper[4725]: I0227 06:17:29.778510 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" event={"ID":"d481d1a7-d116-4dd0-8d67-65654d50228e","Type":"ContainerStarted","Data":"f907406c5259c71bddb4ec838451440ce54c40391ee9ea9ffa01713920532512"} Feb 27 06:17:29 crc 
kubenswrapper[4725]: I0227 06:17:29.778896 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:29 crc kubenswrapper[4725]: I0227 06:17:29.817713 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" podStartSLOduration=2.817680369 podStartE2EDuration="2.817680369s" podCreationTimestamp="2026-02-27 06:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:17:29.810485207 +0000 UTC m=+428.273105856" watchObservedRunningTime="2026-02-27 06:17:29.817680369 +0000 UTC m=+428.280300978" Feb 27 06:17:32 crc kubenswrapper[4725]: I0227 06:17:32.554032 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:17:32 crc kubenswrapper[4725]: I0227 06:17:32.554120 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:17:47 crc kubenswrapper[4725]: I0227 06:17:47.714217 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p89j6" Feb 27 06:17:47 crc kubenswrapper[4725]: I0227 06:17:47.793495 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p26pd"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.198266 4725 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-b6cpp"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.199560 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6cpp" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="registry-server" containerID="cri-o://997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2" gracePeriod=30 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.222246 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-htnhk"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.222751 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-htnhk" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="registry-server" containerID="cri-o://13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f" gracePeriod=30 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.244465 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmwz5"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.244886 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" containerID="cri-o://62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7" gracePeriod=30 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.250606 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q5b8"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.250927 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2q5b8" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="registry-server" 
containerID="cri-o://73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404" gracePeriod=30 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.267245 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b4kzj"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.268328 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.278608 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86fp7"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.278977 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86fp7" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="registry-server" containerID="cri-o://8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639" gracePeriod=30 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.292391 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b4kzj"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.370366 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.370606 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbr29\" (UniqueName: \"kubernetes.io/projected/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-kube-api-access-sbr29\") pod \"marketplace-operator-79b997595-b4kzj\" 
(UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.370664 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.471952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbr29\" (UniqueName: \"kubernetes.io/projected/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-kube-api-access-sbr29\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.472002 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.472066 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.473534 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.480046 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.488463 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbr29\" (UniqueName: \"kubernetes.io/projected/edee26dc-dc59-4500-8fe6-0f9f7e9c4546-kube-api-access-sbr29\") pod \"marketplace-operator-79b997595-b4kzj\" (UID: \"edee26dc-dc59-4500-8fe6-0f9f7e9c4546\") " pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.693747 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.696605 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6cpp" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.701956 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.711988 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.714064 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-htnhk" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.733028 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.876178 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-catalog-content\") pod \"c008fcf9-f898-434a-b077-f8921e01be05\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.876239 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-utilities\") pod \"c008fcf9-f898-434a-b077-f8921e01be05\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.876280 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/c008fcf9-f898-434a-b077-f8921e01be05-kube-api-access-9wwv9\") pod \"c008fcf9-f898-434a-b077-f8921e01be05\" (UID: \"c008fcf9-f898-434a-b077-f8921e01be05\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.876342 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-catalog-content\") pod \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.876366 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-trusted-ca\") pod \"6b2da58a-3e24-4e72-a25d-eeee730910cd\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.876397 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-utilities\") pod \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877400 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz9d2\" (UniqueName: \"kubernetes.io/projected/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-kube-api-access-kz9d2\") pod \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877434 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-operator-metrics\") pod \"6b2da58a-3e24-4e72-a25d-eeee730910cd\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877478 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7mbl\" (UniqueName: \"kubernetes.io/projected/1295a124-164c-403c-8eb6-f71c3a9dc8a7-kube-api-access-s7mbl\") pod \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877505 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-catalog-content\") pod 
\"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877509 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6b2da58a-3e24-4e72-a25d-eeee730910cd" (UID: "6b2da58a-3e24-4e72-a25d-eeee730910cd"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877533 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-utilities" (OuterVolumeSpecName: "utilities") pod "c008fcf9-f898-434a-b077-f8921e01be05" (UID: "c008fcf9-f898-434a-b077-f8921e01be05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877533 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv7j4\" (UniqueName: \"kubernetes.io/projected/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-kube-api-access-zv7j4\") pod \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\" (UID: \"4e0ac478-aa78-481d-84d3-f4a5c6bedadb\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877633 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-catalog-content\") pod \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877662 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-utilities\") pod 
\"1295a124-164c-403c-8eb6-f71c3a9dc8a7\" (UID: \"1295a124-164c-403c-8eb6-f71c3a9dc8a7\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877710 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l5hm\" (UniqueName: \"kubernetes.io/projected/6b2da58a-3e24-4e72-a25d-eeee730910cd-kube-api-access-4l5hm\") pod \"6b2da58a-3e24-4e72-a25d-eeee730910cd\" (UID: \"6b2da58a-3e24-4e72-a25d-eeee730910cd\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877759 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-utilities\") pod \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\" (UID: \"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd\") " Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.877699 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-utilities" (OuterVolumeSpecName: "utilities") pod "4e0ac478-aa78-481d-84d3-f4a5c6bedadb" (UID: "4e0ac478-aa78-481d-84d3-f4a5c6bedadb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.878417 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.878438 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.878451 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.879056 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-utilities" (OuterVolumeSpecName: "utilities") pod "1295a124-164c-403c-8eb6-f71c3a9dc8a7" (UID: "1295a124-164c-403c-8eb6-f71c3a9dc8a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.879964 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-utilities" (OuterVolumeSpecName: "utilities") pod "c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" (UID: "c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.882213 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-kube-api-access-kz9d2" (OuterVolumeSpecName: "kube-api-access-kz9d2") pod "c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" (UID: "c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd"). InnerVolumeSpecName "kube-api-access-kz9d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.889511 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-kube-api-access-zv7j4" (OuterVolumeSpecName: "kube-api-access-zv7j4") pod "4e0ac478-aa78-481d-84d3-f4a5c6bedadb" (UID: "4e0ac478-aa78-481d-84d3-f4a5c6bedadb"). InnerVolumeSpecName "kube-api-access-zv7j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.889629 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c008fcf9-f898-434a-b077-f8921e01be05-kube-api-access-9wwv9" (OuterVolumeSpecName: "kube-api-access-9wwv9") pod "c008fcf9-f898-434a-b077-f8921e01be05" (UID: "c008fcf9-f898-434a-b077-f8921e01be05"). InnerVolumeSpecName "kube-api-access-9wwv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.889635 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6b2da58a-3e24-4e72-a25d-eeee730910cd" (UID: "6b2da58a-3e24-4e72-a25d-eeee730910cd"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.892431 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1295a124-164c-403c-8eb6-f71c3a9dc8a7-kube-api-access-s7mbl" (OuterVolumeSpecName: "kube-api-access-s7mbl") pod "1295a124-164c-403c-8eb6-f71c3a9dc8a7" (UID: "1295a124-164c-403c-8eb6-f71c3a9dc8a7"). InnerVolumeSpecName "kube-api-access-s7mbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.900147 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2da58a-3e24-4e72-a25d-eeee730910cd-kube-api-access-4l5hm" (OuterVolumeSpecName: "kube-api-access-4l5hm") pod "6b2da58a-3e24-4e72-a25d-eeee730910cd" (UID: "6b2da58a-3e24-4e72-a25d-eeee730910cd"). InnerVolumeSpecName "kube-api-access-4l5hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.907829 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c008fcf9-f898-434a-b077-f8921e01be05" (UID: "c008fcf9-f898-434a-b077-f8921e01be05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.944268 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b4kzj"] Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.950162 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e0ac478-aa78-481d-84d3-f4a5c6bedadb" (UID: "4e0ac478-aa78-481d-84d3-f4a5c6bedadb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.970237 4725 generic.go:334] "Generic (PLEG): container finished" podID="c008fcf9-f898-434a-b077-f8921e01be05" containerID="73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404" exitCode=0 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.970337 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q5b8" event={"ID":"c008fcf9-f898-434a-b077-f8921e01be05","Type":"ContainerDied","Data":"73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404"} Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.970377 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q5b8" event={"ID":"c008fcf9-f898-434a-b077-f8921e01be05","Type":"ContainerDied","Data":"1c40e479553f18fc6aa4daa59865cc197c61916a8d2284e94023f40aa6defb2e"} Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.970398 4725 scope.go:117] "RemoveContainer" containerID="73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.970548 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q5b8" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979178 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" event={"ID":"edee26dc-dc59-4500-8fe6-0f9f7e9c4546","Type":"ContainerStarted","Data":"0c62ac3fb66559f3fe8b4d8b24fc27238671038092bd87ca61cb47c61415fb27"} Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979474 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/c008fcf9-f898-434a-b077-f8921e01be05-kube-api-access-9wwv9\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979517 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz9d2\" (UniqueName: \"kubernetes.io/projected/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-kube-api-access-kz9d2\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979534 4725 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6b2da58a-3e24-4e72-a25d-eeee730910cd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979549 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7mbl\" (UniqueName: \"kubernetes.io/projected/1295a124-164c-403c-8eb6-f71c3a9dc8a7-kube-api-access-s7mbl\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979564 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979576 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv7j4\" (UniqueName: 
\"kubernetes.io/projected/4e0ac478-aa78-481d-84d3-f4a5c6bedadb-kube-api-access-zv7j4\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979589 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979603 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l5hm\" (UniqueName: \"kubernetes.io/projected/6b2da58a-3e24-4e72-a25d-eeee730910cd-kube-api-access-4l5hm\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979613 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.979625 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c008fcf9-f898-434a-b077-f8921e01be05-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.980975 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" (UID: "c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.987729 4725 generic.go:334] "Generic (PLEG): container finished" podID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerID="62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7" exitCode=0 Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.987776 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" event={"ID":"6b2da58a-3e24-4e72-a25d-eeee730910cd","Type":"ContainerDied","Data":"62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7"} Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.987804 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.987834 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmwz5" event={"ID":"6b2da58a-3e24-4e72-a25d-eeee730910cd","Type":"ContainerDied","Data":"d52afecf7df04841133babf90ab4d4798d3e09681c6f2f8a09036b464c298397"} Feb 27 06:17:56 crc kubenswrapper[4725]: I0227 06:17:56.997259 4725 scope.go:117] "RemoveContainer" containerID="e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.004749 4725 generic.go:334] "Generic (PLEG): container finished" podID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerID="997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2" exitCode=0 Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.004833 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerDied","Data":"997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2"} Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 
06:17:57.004873 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6cpp" event={"ID":"c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd","Type":"ContainerDied","Data":"09eb3998fde19b40b7e31b963286b117407ae1faadef4cb1156a08f353899e60"} Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.004942 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6cpp" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.029898 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1295a124-164c-403c-8eb6-f71c3a9dc8a7" (UID: "1295a124-164c-403c-8eb6-f71c3a9dc8a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.030743 4725 generic.go:334] "Generic (PLEG): container finished" podID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerID="13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f" exitCode=0 Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.030928 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerDied","Data":"13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f"} Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.030966 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-htnhk" event={"ID":"4e0ac478-aa78-481d-84d3-f4a5c6bedadb","Type":"ContainerDied","Data":"ed4e30b6b4845151a4ac08e4f175981cc02d10a72d2e2342f38e4800eee84cdc"} Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.031140 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-htnhk" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.033436 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q5b8"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.043856 4725 scope.go:117] "RemoveContainer" containerID="a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.045925 4725 generic.go:334] "Generic (PLEG): container finished" podID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerID="8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639" exitCode=0 Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.045962 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86fp7" event={"ID":"1295a124-164c-403c-8eb6-f71c3a9dc8a7","Type":"ContainerDied","Data":"8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639"} Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.045989 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86fp7" event={"ID":"1295a124-164c-403c-8eb6-f71c3a9dc8a7","Type":"ContainerDied","Data":"5ec958d1581966cf1613ed6f2bc01e41c40a90cf2eaa2430593de66b1eb27efc"} Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.046065 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86fp7" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.053228 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q5b8"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.066844 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmwz5"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.073397 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmwz5"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.075916 4725 scope.go:117] "RemoveContainer" containerID="73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.077684 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404\": container with ID starting with 73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404 not found: ID does not exist" containerID="73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.077754 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404"} err="failed to get container status \"73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404\": rpc error: code = NotFound desc = could not find container \"73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404\": container with ID starting with 73d82a96c1e4252dfbc0472d5840071cbe6e016a6c6c9d4c618a5f020f4b0404 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.077791 4725 scope.go:117] "RemoveContainer" 
containerID="e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.079609 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8\": container with ID starting with e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8 not found: ID does not exist" containerID="e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.079629 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8"} err="failed to get container status \"e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8\": rpc error: code = NotFound desc = could not find container \"e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8\": container with ID starting with e8635b616eb01b002aac1ff31087bccdbec1d2eeed6979f427a75ba2c7d3cec8 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.079643 4725 scope.go:117] "RemoveContainer" containerID="a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.079882 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215\": container with ID starting with a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215 not found: ID does not exist" containerID="a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.079904 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215"} err="failed to get container status \"a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215\": rpc error: code = NotFound desc = could not find container \"a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215\": container with ID starting with a74345d30834d7bca7fab2e96041f7c0c688bebf412816e54a2fd66b5116d215 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.079919 4725 scope.go:117] "RemoveContainer" containerID="62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.080868 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.080886 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1295a124-164c-403c-8eb6-f71c3a9dc8a7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.083767 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6cpp"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.097503 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6cpp"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.101989 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-htnhk"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.103997 4725 scope.go:117] "RemoveContainer" containerID="886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.105396 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-htnhk"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.115709 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86fp7"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.120541 4725 scope.go:117] "RemoveContainer" containerID="62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.120961 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7\": container with ID starting with 62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7 not found: ID does not exist" containerID="62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.121001 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7"} err="failed to get container status \"62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7\": rpc error: code = NotFound desc = could not find container \"62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7\": container with ID starting with 62380905b19eec1c3b5d67e7d9bd65333f2f97ababa716d309a37b9e616c49f7 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.121028 4725 scope.go:117] "RemoveContainer" containerID="886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.121327 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec\": container with ID starting with 886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec 
not found: ID does not exist" containerID="886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.121355 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec"} err="failed to get container status \"886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec\": rpc error: code = NotFound desc = could not find container \"886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec\": container with ID starting with 886914e07c983cf74f1cf53511cbe91947a29e2c73271d01543ab91352d6a1ec not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.121374 4725 scope.go:117] "RemoveContainer" containerID="997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.131469 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86fp7"] Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.137435 4725 scope.go:117] "RemoveContainer" containerID="55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.151442 4725 scope.go:117] "RemoveContainer" containerID="2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.165569 4725 scope.go:117] "RemoveContainer" containerID="997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.166513 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2\": container with ID starting with 997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2 not found: ID does not exist" 
containerID="997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.166549 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2"} err="failed to get container status \"997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2\": rpc error: code = NotFound desc = could not find container \"997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2\": container with ID starting with 997955bb5e9603673c9eca82fd36152588a0aa02cd73eb70ff79eaff7b2c1dc2 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.166571 4725 scope.go:117] "RemoveContainer" containerID="55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.166940 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3\": container with ID starting with 55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3 not found: ID does not exist" containerID="55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.166965 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3"} err="failed to get container status \"55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3\": rpc error: code = NotFound desc = could not find container \"55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3\": container with ID starting with 55be79695c278d56e9276bda42c9e9947a4015d34ce75b2c307dd1e307d1e9a3 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.166978 4725 scope.go:117] 
"RemoveContainer" containerID="2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.167209 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce\": container with ID starting with 2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce not found: ID does not exist" containerID="2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.167230 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce"} err="failed to get container status \"2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce\": rpc error: code = NotFound desc = could not find container \"2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce\": container with ID starting with 2ef31297f477d6aca4162a2c5bd58e3675e685c14de4122f172acc4aea192dce not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.167244 4725 scope.go:117] "RemoveContainer" containerID="13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.184014 4725 scope.go:117] "RemoveContainer" containerID="c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.204538 4725 scope.go:117] "RemoveContainer" containerID="228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.220171 4725 scope.go:117] "RemoveContainer" containerID="13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.220549 4725 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f\": container with ID starting with 13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f not found: ID does not exist" containerID="13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.220602 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f"} err="failed to get container status \"13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f\": rpc error: code = NotFound desc = could not find container \"13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f\": container with ID starting with 13b637344665f5e169dbbbafae94d45a5029414597345290530f3dab67a9434f not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.220636 4725 scope.go:117] "RemoveContainer" containerID="c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.220915 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e\": container with ID starting with c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e not found: ID does not exist" containerID="c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.221027 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e"} err="failed to get container status \"c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e\": rpc error: code = NotFound desc = could not find container 
\"c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e\": container with ID starting with c74650371a06b8ef04f55c8332650b8808de77916ed48f12f4d3433eec890a5e not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.221104 4725 scope.go:117] "RemoveContainer" containerID="228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.221435 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942\": container with ID starting with 228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942 not found: ID does not exist" containerID="228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.221455 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942"} err="failed to get container status \"228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942\": rpc error: code = NotFound desc = could not find container \"228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942\": container with ID starting with 228fed428064c48cf66b96650daf518d860f3b2ee800b7ca6b63146b43247942 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.221468 4725 scope.go:117] "RemoveContainer" containerID="8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.234989 4725 scope.go:117] "RemoveContainer" containerID="6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.246906 4725 scope.go:117] "RemoveContainer" containerID="416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5" Feb 27 06:17:57 crc 
kubenswrapper[4725]: I0227 06:17:57.258107 4725 scope.go:117] "RemoveContainer" containerID="8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.258579 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639\": container with ID starting with 8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639 not found: ID does not exist" containerID="8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.258630 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639"} err="failed to get container status \"8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639\": rpc error: code = NotFound desc = could not find container \"8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639\": container with ID starting with 8475ae8e81a3be6a5f2da80e93adb95558f5104f51ab82b3df6eb33fa7d17639 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.258662 4725 scope.go:117] "RemoveContainer" containerID="6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.258929 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9\": container with ID starting with 6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9 not found: ID does not exist" containerID="6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.258960 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9"} err="failed to get container status \"6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9\": rpc error: code = NotFound desc = could not find container \"6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9\": container with ID starting with 6511d11418ebcfc40a1b58cec9d57701e55746e7534d077f1452f8811d3fc6d9 not found: ID does not exist" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.258981 4725 scope.go:117] "RemoveContainer" containerID="416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5" Feb 27 06:17:57 crc kubenswrapper[4725]: E0227 06:17:57.259250 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5\": container with ID starting with 416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5 not found: ID does not exist" containerID="416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5" Feb 27 06:17:57 crc kubenswrapper[4725]: I0227 06:17:57.259280 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5"} err="failed to get container status \"416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5\": rpc error: code = NotFound desc = could not find container \"416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5\": container with ID starting with 416c904fde0048f4580b44442d2261f40db699812640c8d3365396f4de0868d5 not found: ID does not exist" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.058416 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" 
event={"ID":"edee26dc-dc59-4500-8fe6-0f9f7e9c4546","Type":"ContainerStarted","Data":"b07846fe1ccb4155d6f138babfd3ccb84f87928d1e4256f343f1ccbf51c0d7f3"} Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.058739 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.064169 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.094231 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-b4kzj" podStartSLOduration=2.094196673 podStartE2EDuration="2.094196673s" podCreationTimestamp="2026-02-27 06:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:17:58.085412689 +0000 UTC m=+456.548033268" watchObservedRunningTime="2026-02-27 06:17:58.094196673 +0000 UTC m=+456.556817292" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.258766 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" path="/var/lib/kubelet/pods/1295a124-164c-403c-8eb6-f71c3a9dc8a7/volumes" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.259590 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" path="/var/lib/kubelet/pods/4e0ac478-aa78-481d-84d3-f4a5c6bedadb/volumes" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.260967 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" path="/var/lib/kubelet/pods/6b2da58a-3e24-4e72-a25d-eeee730910cd/volumes" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.262452 4725 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="c008fcf9-f898-434a-b077-f8921e01be05" path="/var/lib/kubelet/pods/c008fcf9-f898-434a-b077-f8921e01be05/volumes" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.263213 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" path="/var/lib/kubelet/pods/c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd/volumes" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.418883 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gnhx6"] Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419114 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419129 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419144 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419152 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419164 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419172 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419186 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="registry-server" Feb 27 06:17:58 crc 
kubenswrapper[4725]: I0227 06:17:58.419194 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419205 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419214 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419226 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419235 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419250 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419258 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419270 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419278 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419334 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="extract-content" Feb 27 06:17:58 crc 
kubenswrapper[4725]: I0227 06:17:58.419342 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419358 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419366 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419377 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419385 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419400 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419408 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="extract-content" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419421 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419429 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="extract-utilities" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419546 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1295a124-164c-403c-8eb6-f71c3a9dc8a7" containerName="registry-server" Feb 27 06:17:58 crc 
kubenswrapper[4725]: I0227 06:17:58.419558 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419571 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0ac478-aa78-481d-84d3-f4a5c6bedadb" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419583 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419594 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c008fcf9-f898-434a-b077-f8921e01be05" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419612 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e4b534-f1df-4ed2-86b4-5e8bae5ae3bd" containerName="registry-server" Feb 27 06:17:58 crc kubenswrapper[4725]: E0227 06:17:58.419718 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.419727 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2da58a-3e24-4e72-a25d-eeee730910cd" containerName="marketplace-operator" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.420513 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.424564 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.439845 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnhx6"] Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.599115 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-catalog-content\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.599181 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-utilities\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.599211 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfw4\" (UniqueName: \"kubernetes.io/projected/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-kube-api-access-jwfw4\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.608690 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5vq6n"] Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.611965 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.613801 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.619616 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vq6n"] Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701117 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e5d8d3-00b5-4799-afd9-d360d58aee21-catalog-content\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701213 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-catalog-content\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701255 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-utilities\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701302 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfw4\" (UniqueName: \"kubernetes.io/projected/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-kube-api-access-jwfw4\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " 
pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701328 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e5d8d3-00b5-4799-afd9-d360d58aee21-utilities\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701353 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97ld\" (UniqueName: \"kubernetes.io/projected/a5e5d8d3-00b5-4799-afd9-d360d58aee21-kube-api-access-x97ld\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.701729 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-catalog-content\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.702173 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-utilities\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.727231 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfw4\" (UniqueName: \"kubernetes.io/projected/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-kube-api-access-jwfw4\") pod \"certified-operators-gnhx6\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " 
pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.756076 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.802417 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e5d8d3-00b5-4799-afd9-d360d58aee21-utilities\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.802488 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x97ld\" (UniqueName: \"kubernetes.io/projected/a5e5d8d3-00b5-4799-afd9-d360d58aee21-kube-api-access-x97ld\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.802563 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e5d8d3-00b5-4799-afd9-d360d58aee21-catalog-content\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.803277 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e5d8d3-00b5-4799-afd9-d360d58aee21-utilities\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.803502 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a5e5d8d3-00b5-4799-afd9-d360d58aee21-catalog-content\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.822595 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x97ld\" (UniqueName: \"kubernetes.io/projected/a5e5d8d3-00b5-4799-afd9-d360d58aee21-kube-api-access-x97ld\") pod \"redhat-marketplace-5vq6n\" (UID: \"a5e5d8d3-00b5-4799-afd9-d360d58aee21\") " pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:58 crc kubenswrapper[4725]: I0227 06:17:58.949311 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:17:59 crc kubenswrapper[4725]: I0227 06:17:59.179226 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gnhx6"] Feb 27 06:17:59 crc kubenswrapper[4725]: W0227 06:17:59.183654 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb9bdb7d_d8b8_4937_9970_f32ee3abe121.slice/crio-5de2619278768d736effbd06aa3cef62c227307e79b191ec495497c096293a08 WatchSource:0}: Error finding container 5de2619278768d736effbd06aa3cef62c227307e79b191ec495497c096293a08: Status 404 returned error can't find the container with id 5de2619278768d736effbd06aa3cef62c227307e79b191ec495497c096293a08 Feb 27 06:17:59 crc kubenswrapper[4725]: I0227 06:17:59.192036 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vq6n"] Feb 27 06:17:59 crc kubenswrapper[4725]: W0227 06:17:59.200245 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e5d8d3_00b5_4799_afd9_d360d58aee21.slice/crio-078c4166050256c80e469e8c071dc8d52e0e49d1ba47dcfc612594b762b2b453 WatchSource:0}: Error finding container 078c4166050256c80e469e8c071dc8d52e0e49d1ba47dcfc612594b762b2b453: Status 404 returned error can't find the container with id 078c4166050256c80e469e8c071dc8d52e0e49d1ba47dcfc612594b762b2b453 Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.080002 4725 generic.go:334] "Generic (PLEG): container finished" podID="a5e5d8d3-00b5-4799-afd9-d360d58aee21" containerID="161c7c67eee0c938be83bc0814ac436c1597043db586f9b37e1d009e15f0f49d" exitCode=0 Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.080187 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vq6n" event={"ID":"a5e5d8d3-00b5-4799-afd9-d360d58aee21","Type":"ContainerDied","Data":"161c7c67eee0c938be83bc0814ac436c1597043db586f9b37e1d009e15f0f49d"} Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.083874 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vq6n" event={"ID":"a5e5d8d3-00b5-4799-afd9-d360d58aee21","Type":"ContainerStarted","Data":"078c4166050256c80e469e8c071dc8d52e0e49d1ba47dcfc612594b762b2b453"} Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.087694 4725 generic.go:334] "Generic (PLEG): container finished" podID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerID="8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e" exitCode=0 Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.087760 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerDied","Data":"8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e"} Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.087820 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerStarted","Data":"5de2619278768d736effbd06aa3cef62c227307e79b191ec495497c096293a08"} Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.188781 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536218-dqxfn"] Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.189915 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.194089 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.194310 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.195657 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.197848 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536218-dqxfn"] Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.325214 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s66tr\" (UniqueName: \"kubernetes.io/projected/7839ac25-2229-4c28-afd8-f8f6e997a018-kube-api-access-s66tr\") pod \"auto-csr-approver-29536218-dqxfn\" (UID: \"7839ac25-2229-4c28-afd8-f8f6e997a018\") " pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.426960 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s66tr\" (UniqueName: \"kubernetes.io/projected/7839ac25-2229-4c28-afd8-f8f6e997a018-kube-api-access-s66tr\") pod 
\"auto-csr-approver-29536218-dqxfn\" (UID: \"7839ac25-2229-4c28-afd8-f8f6e997a018\") " pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.449245 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s66tr\" (UniqueName: \"kubernetes.io/projected/7839ac25-2229-4c28-afd8-f8f6e997a018-kube-api-access-s66tr\") pod \"auto-csr-approver-29536218-dqxfn\" (UID: \"7839ac25-2229-4c28-afd8-f8f6e997a018\") " pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.518444 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.786700 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536218-dqxfn"] Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.821776 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dw9mm"] Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.822874 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.824802 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.837097 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dw9mm"] Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.935588 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wffbs\" (UniqueName: \"kubernetes.io/projected/8dc5591a-ae2b-4664-8c09-216b72be4a2e-kube-api-access-wffbs\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.935905 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc5591a-ae2b-4664-8c09-216b72be4a2e-utilities\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:00 crc kubenswrapper[4725]: I0227 06:18:00.935966 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc5591a-ae2b-4664-8c09-216b72be4a2e-catalog-content\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.030071 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zptvt"] Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.032116 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.036815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc5591a-ae2b-4664-8c09-216b72be4a2e-catalog-content\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.036862 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wffbs\" (UniqueName: \"kubernetes.io/projected/8dc5591a-ae2b-4664-8c09-216b72be4a2e-kube-api-access-wffbs\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.036891 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc5591a-ae2b-4664-8c09-216b72be4a2e-utilities\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.037408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dc5591a-ae2b-4664-8c09-216b72be4a2e-utilities\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.037714 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dc5591a-ae2b-4664-8c09-216b72be4a2e-catalog-content\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " 
pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.037753 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.038950 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zptvt"] Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.075112 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wffbs\" (UniqueName: \"kubernetes.io/projected/8dc5591a-ae2b-4664-8c09-216b72be4a2e-kube-api-access-wffbs\") pod \"redhat-operators-dw9mm\" (UID: \"8dc5591a-ae2b-4664-8c09-216b72be4a2e\") " pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.115038 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" event={"ID":"7839ac25-2229-4c28-afd8-f8f6e997a018","Type":"ContainerStarted","Data":"207f718c88bb747532d159af6b74a854621963d1335941193b7a62ed8867e734"} Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.116782 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vq6n" event={"ID":"a5e5d8d3-00b5-4799-afd9-d360d58aee21","Type":"ContainerStarted","Data":"5e75eca09fd63f8007da33d99724acd513d55ebe236fa00566ddb57dd92db76d"} Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.119011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerStarted","Data":"6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4"} Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.139042 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttc99\" (UniqueName: 
\"kubernetes.io/projected/aca14ae6-3333-4838-9732-be9096c892ac-kube-api-access-ttc99\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.139163 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-catalog-content\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.139203 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-utilities\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.206031 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.242651 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttc99\" (UniqueName: \"kubernetes.io/projected/aca14ae6-3333-4838-9732-be9096c892ac-kube-api-access-ttc99\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.242784 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-catalog-content\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.242812 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-utilities\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.243476 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-utilities\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.243718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-catalog-content\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " 
pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.260985 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttc99\" (UniqueName: \"kubernetes.io/projected/aca14ae6-3333-4838-9732-be9096c892ac-kube-api-access-ttc99\") pod \"community-operators-zptvt\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.368146 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.432145 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dw9mm"] Feb 27 06:18:01 crc kubenswrapper[4725]: I0227 06:18:01.553801 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zptvt"] Feb 27 06:18:01 crc kubenswrapper[4725]: W0227 06:18:01.559831 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca14ae6_3333_4838_9732_be9096c892ac.slice/crio-c21617c655c31a0d947fb49ac90d6fb86a014c808015f733fcac48aa354b956e WatchSource:0}: Error finding container c21617c655c31a0d947fb49ac90d6fb86a014c808015f733fcac48aa354b956e: Status 404 returned error can't find the container with id c21617c655c31a0d947fb49ac90d6fb86a014c808015f733fcac48aa354b956e Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.127332 4725 generic.go:334] "Generic (PLEG): container finished" podID="aca14ae6-3333-4838-9732-be9096c892ac" containerID="05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80" exitCode=0 Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.127455 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zptvt" 
event={"ID":"aca14ae6-3333-4838-9732-be9096c892ac","Type":"ContainerDied","Data":"05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80"} Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.128455 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zptvt" event={"ID":"aca14ae6-3333-4838-9732-be9096c892ac","Type":"ContainerStarted","Data":"c21617c655c31a0d947fb49ac90d6fb86a014c808015f733fcac48aa354b956e"} Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.131409 4725 generic.go:334] "Generic (PLEG): container finished" podID="8dc5591a-ae2b-4664-8c09-216b72be4a2e" containerID="39d2d11ddbeb89e2bc82e198f54069983d0f3342954445b895954b95066b8cc8" exitCode=0 Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.131481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dw9mm" event={"ID":"8dc5591a-ae2b-4664-8c09-216b72be4a2e","Type":"ContainerDied","Data":"39d2d11ddbeb89e2bc82e198f54069983d0f3342954445b895954b95066b8cc8"} Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.131509 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dw9mm" event={"ID":"8dc5591a-ae2b-4664-8c09-216b72be4a2e","Type":"ContainerStarted","Data":"1cc5c94e4be14838691587300782e0f6fa5ef09e75f1c7a877496498e5cd017d"} Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.138437 4725 generic.go:334] "Generic (PLEG): container finished" podID="a5e5d8d3-00b5-4799-afd9-d360d58aee21" containerID="5e75eca09fd63f8007da33d99724acd513d55ebe236fa00566ddb57dd92db76d" exitCode=0 Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.138482 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vq6n" event={"ID":"a5e5d8d3-00b5-4799-afd9-d360d58aee21","Type":"ContainerDied","Data":"5e75eca09fd63f8007da33d99724acd513d55ebe236fa00566ddb57dd92db76d"} Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 
06:18:02.147755 4725 generic.go:334] "Generic (PLEG): container finished" podID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerID="6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4" exitCode=0 Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.149074 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerDied","Data":"6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4"} Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.554348 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.554433 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.554501 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.555325 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f09b0a4f42ec4f4cb86c337e85961d890e6fd84143196dec401eb3a8f601acdd"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:18:02 crc kubenswrapper[4725]: I0227 06:18:02.555428 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://f09b0a4f42ec4f4cb86c337e85961d890e6fd84143196dec401eb3a8f601acdd" gracePeriod=600 Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.170136 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vq6n" event={"ID":"a5e5d8d3-00b5-4799-afd9-d360d58aee21","Type":"ContainerStarted","Data":"3ad15c710e0d9f1b0881fc034343f9f5fd846674095f67602aa54bccde59c7b7"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.173466 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerStarted","Data":"c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.175996 4725 generic.go:334] "Generic (PLEG): container finished" podID="7839ac25-2229-4c28-afd8-f8f6e997a018" containerID="ddbf53cf5ff95e919b248f533f46b56c2c853b0f5bb658de3f2bda113960db36" exitCode=0 Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.176052 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" event={"ID":"7839ac25-2229-4c28-afd8-f8f6e997a018","Type":"ContainerDied","Data":"ddbf53cf5ff95e919b248f533f46b56c2c853b0f5bb658de3f2bda113960db36"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.178649 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="f09b0a4f42ec4f4cb86c337e85961d890e6fd84143196dec401eb3a8f601acdd" exitCode=0 Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.178724 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"f09b0a4f42ec4f4cb86c337e85961d890e6fd84143196dec401eb3a8f601acdd"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.178782 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"3de280f968967bd9a16dbc4f0f0a7eae44829783c4f2138d5e7a2fbcb79954ed"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.178802 4725 scope.go:117] "RemoveContainer" containerID="f218589ea5e3fecf4cdf15a258f92aad8759a47b2ad7955ab3b93549e5046b72" Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.180414 4725 generic.go:334] "Generic (PLEG): container finished" podID="aca14ae6-3333-4838-9732-be9096c892ac" containerID="c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1" exitCode=0 Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.180469 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zptvt" event={"ID":"aca14ae6-3333-4838-9732-be9096c892ac","Type":"ContainerDied","Data":"c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.182478 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dw9mm" event={"ID":"8dc5591a-ae2b-4664-8c09-216b72be4a2e","Type":"ContainerStarted","Data":"f2d89c6804c2575f4ed57b9627be999c67283e75faef7db9828dd8f15c7f0f99"} Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.199910 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5vq6n" podStartSLOduration=2.48066289 podStartE2EDuration="5.199889397s" podCreationTimestamp="2026-02-27 06:17:58 +0000 UTC" firstStartedPulling="2026-02-27 06:18:00.082716482 +0000 UTC m=+458.545337091" lastFinishedPulling="2026-02-27 
06:18:02.801943029 +0000 UTC m=+461.264563598" observedRunningTime="2026-02-27 06:18:03.191906834 +0000 UTC m=+461.654527403" watchObservedRunningTime="2026-02-27 06:18:03.199889397 +0000 UTC m=+461.662509966" Feb 27 06:18:03 crc kubenswrapper[4725]: I0227 06:18:03.214113 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gnhx6" podStartSLOduration=2.747184013 podStartE2EDuration="5.214087086s" podCreationTimestamp="2026-02-27 06:17:58 +0000 UTC" firstStartedPulling="2026-02-27 06:18:00.091160188 +0000 UTC m=+458.553780757" lastFinishedPulling="2026-02-27 06:18:02.558063231 +0000 UTC m=+461.020683830" observedRunningTime="2026-02-27 06:18:03.211776824 +0000 UTC m=+461.674397393" watchObservedRunningTime="2026-02-27 06:18:03.214087086 +0000 UTC m=+461.676707655" Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.190794 4725 generic.go:334] "Generic (PLEG): container finished" podID="8dc5591a-ae2b-4664-8c09-216b72be4a2e" containerID="f2d89c6804c2575f4ed57b9627be999c67283e75faef7db9828dd8f15c7f0f99" exitCode=0 Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.191028 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dw9mm" event={"ID":"8dc5591a-ae2b-4664-8c09-216b72be4a2e","Type":"ContainerDied","Data":"f2d89c6804c2575f4ed57b9627be999c67283e75faef7db9828dd8f15c7f0f99"} Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.203978 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zptvt" event={"ID":"aca14ae6-3333-4838-9732-be9096c892ac","Type":"ContainerStarted","Data":"6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e"} Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.243326 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zptvt" podStartSLOduration=2.532160391 podStartE2EDuration="4.243304139s" 
podCreationTimestamp="2026-02-27 06:18:00 +0000 UTC" firstStartedPulling="2026-02-27 06:18:02.130488213 +0000 UTC m=+460.593108802" lastFinishedPulling="2026-02-27 06:18:03.841631971 +0000 UTC m=+462.304252550" observedRunningTime="2026-02-27 06:18:04.241643444 +0000 UTC m=+462.704264013" watchObservedRunningTime="2026-02-27 06:18:04.243304139 +0000 UTC m=+462.705924698" Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.517040 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.690122 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s66tr\" (UniqueName: \"kubernetes.io/projected/7839ac25-2229-4c28-afd8-f8f6e997a018-kube-api-access-s66tr\") pod \"7839ac25-2229-4c28-afd8-f8f6e997a018\" (UID: \"7839ac25-2229-4c28-afd8-f8f6e997a018\") " Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.703568 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7839ac25-2229-4c28-afd8-f8f6e997a018-kube-api-access-s66tr" (OuterVolumeSpecName: "kube-api-access-s66tr") pod "7839ac25-2229-4c28-afd8-f8f6e997a018" (UID: "7839ac25-2229-4c28-afd8-f8f6e997a018"). InnerVolumeSpecName "kube-api-access-s66tr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:18:04 crc kubenswrapper[4725]: I0227 06:18:04.792925 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s66tr\" (UniqueName: \"kubernetes.io/projected/7839ac25-2229-4c28-afd8-f8f6e997a018-kube-api-access-s66tr\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.213114 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dw9mm" event={"ID":"8dc5591a-ae2b-4664-8c09-216b72be4a2e","Type":"ContainerStarted","Data":"19d894afc2785e9da54d28c9dabad0534dcad2c24c96f76165ae785ba88592de"} Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.214411 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.214454 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536218-dqxfn" event={"ID":"7839ac25-2229-4c28-afd8-f8f6e997a018","Type":"ContainerDied","Data":"207f718c88bb747532d159af6b74a854621963d1335941193b7a62ed8867e734"} Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.214527 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="207f718c88bb747532d159af6b74a854621963d1335941193b7a62ed8867e734" Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.231390 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dw9mm" podStartSLOduration=2.6642186150000002 podStartE2EDuration="5.231373494s" podCreationTimestamp="2026-02-27 06:18:00 +0000 UTC" firstStartedPulling="2026-02-27 06:18:02.133118993 +0000 UTC m=+460.595739562" lastFinishedPulling="2026-02-27 06:18:04.700273862 +0000 UTC m=+463.162894441" observedRunningTime="2026-02-27 06:18:05.231318912 +0000 UTC m=+463.693939481" watchObservedRunningTime="2026-02-27 
06:18:05.231373494 +0000 UTC m=+463.693994063" Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.578123 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536212-627nh"] Feb 27 06:18:05 crc kubenswrapper[4725]: I0227 06:18:05.581921 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536212-627nh"] Feb 27 06:18:06 crc kubenswrapper[4725]: I0227 06:18:06.264485 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234512e0-3471-4bd8-b783-6df7b63f2cfe" path="/var/lib/kubelet/pods/234512e0-3471-4bd8-b783-6df7b63f2cfe/volumes" Feb 27 06:18:08 crc kubenswrapper[4725]: I0227 06:18:08.756834 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:18:08 crc kubenswrapper[4725]: I0227 06:18:08.757219 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:18:08 crc kubenswrapper[4725]: I0227 06:18:08.832824 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:18:08 crc kubenswrapper[4725]: I0227 06:18:08.949779 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:18:08 crc kubenswrapper[4725]: I0227 06:18:08.949855 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:18:09 crc kubenswrapper[4725]: I0227 06:18:09.012805 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:18:09 crc kubenswrapper[4725]: I0227 06:18:09.310126 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5vq6n" Feb 27 06:18:09 crc 
kubenswrapper[4725]: I0227 06:18:09.319121 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 06:18:11 crc kubenswrapper[4725]: I0227 06:18:11.207125 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:11 crc kubenswrapper[4725]: I0227 06:18:11.207218 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:11 crc kubenswrapper[4725]: I0227 06:18:11.368575 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:11 crc kubenswrapper[4725]: I0227 06:18:11.368647 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:11 crc kubenswrapper[4725]: I0227 06:18:11.418423 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:12 crc kubenswrapper[4725]: I0227 06:18:12.274700 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dw9mm" podUID="8dc5591a-ae2b-4664-8c09-216b72be4a2e" containerName="registry-server" probeResult="failure" output=< Feb 27 06:18:12 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:18:12 crc kubenswrapper[4725]: > Feb 27 06:18:12 crc kubenswrapper[4725]: I0227 06:18:12.310485 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:18:12 crc kubenswrapper[4725]: I0227 06:18:12.848508 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" podUID="a58d5af7-837b-45b1-a3cb-ffc3172f54e1" containerName="registry" 
containerID="cri-o://d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1" gracePeriod=30 Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.245262 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.268917 4725 generic.go:334] "Generic (PLEG): container finished" podID="a58d5af7-837b-45b1-a3cb-ffc3172f54e1" containerID="d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1" exitCode=0 Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.268979 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" event={"ID":"a58d5af7-837b-45b1-a3cb-ffc3172f54e1","Type":"ContainerDied","Data":"d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1"} Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.269053 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" event={"ID":"a58d5af7-837b-45b1-a3cb-ffc3172f54e1","Type":"ContainerDied","Data":"e0bd00c59d5bdb614f56ace5d9f940a23d33f77fed6aac059fb43edb4faf1961"} Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.269111 4725 scope.go:117] "RemoveContainer" containerID="d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.269353 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p26pd" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.287947 4725 scope.go:117] "RemoveContainer" containerID="d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1" Feb 27 06:18:13 crc kubenswrapper[4725]: E0227 06:18:13.288324 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1\": container with ID starting with d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1 not found: ID does not exist" containerID="d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.288352 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1"} err="failed to get container status \"d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1\": rpc error: code = NotFound desc = could not find container \"d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1\": container with ID starting with d8a8a2f388c3d7036f37d491e47154840e57976a3be3a2dfd6eabe26de228cf1 not found: ID does not exist" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.342046 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-certificates\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.342135 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-bound-sa-token\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" 
(UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.342180 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-tls\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.344508 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.356509 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.365646 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443052 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-installation-pull-secrets\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443095 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-trusted-ca\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443233 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443270 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-ca-trust-extracted\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443302 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkm57\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-kube-api-access-kkm57\") pod \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\" (UID: \"a58d5af7-837b-45b1-a3cb-ffc3172f54e1\") " Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443451 4725 reconciler_common.go:293] "Volume detached for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443462 4725 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443471 4725 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.443820 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.458799 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.459109 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.460057 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.460280 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-kube-api-access-kkm57" (OuterVolumeSpecName: "kube-api-access-kkm57") pod "a58d5af7-837b-45b1-a3cb-ffc3172f54e1" (UID: "a58d5af7-837b-45b1-a3cb-ffc3172f54e1"). InnerVolumeSpecName "kube-api-access-kkm57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.544113 4725 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.544147 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.544158 4725 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.544167 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkm57\" (UniqueName: 
\"kubernetes.io/projected/a58d5af7-837b-45b1-a3cb-ffc3172f54e1-kube-api-access-kkm57\") on node \"crc\" DevicePath \"\"" Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.608318 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p26pd"] Feb 27 06:18:13 crc kubenswrapper[4725]: I0227 06:18:13.612446 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p26pd"] Feb 27 06:18:14 crc kubenswrapper[4725]: I0227 06:18:14.261530 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58d5af7-837b-45b1-a3cb-ffc3172f54e1" path="/var/lib/kubelet/pods/a58d5af7-837b-45b1-a3cb-ffc3172f54e1/volumes" Feb 27 06:18:21 crc kubenswrapper[4725]: I0227 06:18:21.279181 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:18:21 crc kubenswrapper[4725]: I0227 06:18:21.339661 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dw9mm" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.141122 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536220-9dzkm"] Feb 27 06:20:00 crc kubenswrapper[4725]: E0227 06:20:00.142058 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7839ac25-2229-4c28-afd8-f8f6e997a018" containerName="oc" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.142080 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7839ac25-2229-4c28-afd8-f8f6e997a018" containerName="oc" Feb 27 06:20:00 crc kubenswrapper[4725]: E0227 06:20:00.142119 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58d5af7-837b-45b1-a3cb-ffc3172f54e1" containerName="registry" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.142133 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a58d5af7-837b-45b1-a3cb-ffc3172f54e1" containerName="registry" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.142319 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58d5af7-837b-45b1-a3cb-ffc3172f54e1" containerName="registry" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.142350 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7839ac25-2229-4c28-afd8-f8f6e997a018" containerName="oc" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.142912 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.176560 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.177069 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.177080 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.191250 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536220-9dzkm"] Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.238707 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqbbd\" (UniqueName: \"kubernetes.io/projected/4606ec17-1d6b-4af7-b13b-10ed389a0987-kube-api-access-dqbbd\") pod \"auto-csr-approver-29536220-9dzkm\" (UID: \"4606ec17-1d6b-4af7-b13b-10ed389a0987\") " pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.340452 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqbbd\" (UniqueName: 
\"kubernetes.io/projected/4606ec17-1d6b-4af7-b13b-10ed389a0987-kube-api-access-dqbbd\") pod \"auto-csr-approver-29536220-9dzkm\" (UID: \"4606ec17-1d6b-4af7-b13b-10ed389a0987\") " pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.380414 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqbbd\" (UniqueName: \"kubernetes.io/projected/4606ec17-1d6b-4af7-b13b-10ed389a0987-kube-api-access-dqbbd\") pod \"auto-csr-approver-29536220-9dzkm\" (UID: \"4606ec17-1d6b-4af7-b13b-10ed389a0987\") " pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.506236 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.787357 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536220-9dzkm"] Feb 27 06:20:00 crc kubenswrapper[4725]: I0227 06:20:00.803138 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:20:01 crc kubenswrapper[4725]: I0227 06:20:01.046966 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" event={"ID":"4606ec17-1d6b-4af7-b13b-10ed389a0987","Type":"ContainerStarted","Data":"eebf09d7da9be397d24831f8a24fb1b2241751a236a7b0bf81e080691e9aa05c"} Feb 27 06:20:02 crc kubenswrapper[4725]: I0227 06:20:02.554647 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:20:02 crc kubenswrapper[4725]: I0227 06:20:02.555025 4725 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:20:03 crc kubenswrapper[4725]: I0227 06:20:03.067344 4725 generic.go:334] "Generic (PLEG): container finished" podID="4606ec17-1d6b-4af7-b13b-10ed389a0987" containerID="2aa66bce59acc5b8a9510321b95d44a04e9df3bed65dcc81dd688185f55a28f0" exitCode=0 Feb 27 06:20:03 crc kubenswrapper[4725]: I0227 06:20:03.067476 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" event={"ID":"4606ec17-1d6b-4af7-b13b-10ed389a0987","Type":"ContainerDied","Data":"2aa66bce59acc5b8a9510321b95d44a04e9df3bed65dcc81dd688185f55a28f0"} Feb 27 06:20:04 crc kubenswrapper[4725]: I0227 06:20:04.389565 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:04 crc kubenswrapper[4725]: I0227 06:20:04.516142 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqbbd\" (UniqueName: \"kubernetes.io/projected/4606ec17-1d6b-4af7-b13b-10ed389a0987-kube-api-access-dqbbd\") pod \"4606ec17-1d6b-4af7-b13b-10ed389a0987\" (UID: \"4606ec17-1d6b-4af7-b13b-10ed389a0987\") " Feb 27 06:20:04 crc kubenswrapper[4725]: I0227 06:20:04.525731 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4606ec17-1d6b-4af7-b13b-10ed389a0987-kube-api-access-dqbbd" (OuterVolumeSpecName: "kube-api-access-dqbbd") pod "4606ec17-1d6b-4af7-b13b-10ed389a0987" (UID: "4606ec17-1d6b-4af7-b13b-10ed389a0987"). InnerVolumeSpecName "kube-api-access-dqbbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:20:04 crc kubenswrapper[4725]: I0227 06:20:04.618101 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqbbd\" (UniqueName: \"kubernetes.io/projected/4606ec17-1d6b-4af7-b13b-10ed389a0987-kube-api-access-dqbbd\") on node \"crc\" DevicePath \"\"" Feb 27 06:20:05 crc kubenswrapper[4725]: I0227 06:20:05.084628 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" event={"ID":"4606ec17-1d6b-4af7-b13b-10ed389a0987","Type":"ContainerDied","Data":"eebf09d7da9be397d24831f8a24fb1b2241751a236a7b0bf81e080691e9aa05c"} Feb 27 06:20:05 crc kubenswrapper[4725]: I0227 06:20:05.084682 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eebf09d7da9be397d24831f8a24fb1b2241751a236a7b0bf81e080691e9aa05c" Feb 27 06:20:05 crc kubenswrapper[4725]: I0227 06:20:05.084689 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536220-9dzkm" Feb 27 06:20:05 crc kubenswrapper[4725]: I0227 06:20:05.464349 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536214-p7k5h"] Feb 27 06:20:05 crc kubenswrapper[4725]: I0227 06:20:05.467519 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536214-p7k5h"] Feb 27 06:20:06 crc kubenswrapper[4725]: I0227 06:20:06.262900 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c0abcb-fb62-4f62-b73e-a27620de9add" path="/var/lib/kubelet/pods/06c0abcb-fb62-4f62-b73e-a27620de9add/volumes" Feb 27 06:20:32 crc kubenswrapper[4725]: I0227 06:20:32.554825 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 06:20:32 crc kubenswrapper[4725]: I0227 06:20:32.556282 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:21:02 crc kubenswrapper[4725]: I0227 06:21:02.554803 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:21:02 crc kubenswrapper[4725]: I0227 06:21:02.555781 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:21:02 crc kubenswrapper[4725]: I0227 06:21:02.555957 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:21:02 crc kubenswrapper[4725]: I0227 06:21:02.557221 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3de280f968967bd9a16dbc4f0f0a7eae44829783c4f2138d5e7a2fbcb79954ed"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:21:02 crc kubenswrapper[4725]: I0227 06:21:02.557382 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://3de280f968967bd9a16dbc4f0f0a7eae44829783c4f2138d5e7a2fbcb79954ed" gracePeriod=600 Feb 27 06:21:03 crc kubenswrapper[4725]: I0227 06:21:03.541603 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="3de280f968967bd9a16dbc4f0f0a7eae44829783c4f2138d5e7a2fbcb79954ed" exitCode=0 Feb 27 06:21:03 crc kubenswrapper[4725]: I0227 06:21:03.541655 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"3de280f968967bd9a16dbc4f0f0a7eae44829783c4f2138d5e7a2fbcb79954ed"} Feb 27 06:21:03 crc kubenswrapper[4725]: I0227 06:21:03.542645 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"f1f293a213b8036dbb26c658eb7bfe019d4be19b3467bb55e8ef79040281b13b"} Feb 27 06:21:03 crc kubenswrapper[4725]: I0227 06:21:03.542685 4725 scope.go:117] "RemoveContainer" containerID="f09b0a4f42ec4f4cb86c337e85961d890e6fd84143196dec401eb3a8f601acdd" Feb 27 06:21:39 crc kubenswrapper[4725]: I0227 06:21:39.318000 4725 scope.go:117] "RemoveContainer" containerID="76c19525ba025394fb52d83c22e1eb7190d94dd6551569de289c5d63f566f2af" Feb 27 06:21:39 crc kubenswrapper[4725]: I0227 06:21:39.378706 4725 scope.go:117] "RemoveContainer" containerID="dec50bf79e349fe9abc1a4b764874fa1b281fe7a54a003aef33b4ec78a2900b6" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.151369 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536222-m8lnm"] Feb 27 06:22:00 crc kubenswrapper[4725]: E0227 06:22:00.152616 4725 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4606ec17-1d6b-4af7-b13b-10ed389a0987" containerName="oc" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.152642 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4606ec17-1d6b-4af7-b13b-10ed389a0987" containerName="oc" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.152834 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4606ec17-1d6b-4af7-b13b-10ed389a0987" containerName="oc" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.153552 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.158845 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.159016 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.159054 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.171527 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536222-m8lnm"] Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.281866 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hflvb\" (UniqueName: \"kubernetes.io/projected/be5f726c-3d3b-4b48-9ce9-4ce76e329edf-kube-api-access-hflvb\") pod \"auto-csr-approver-29536222-m8lnm\" (UID: \"be5f726c-3d3b-4b48-9ce9-4ce76e329edf\") " pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.384131 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hflvb\" (UniqueName: 
\"kubernetes.io/projected/be5f726c-3d3b-4b48-9ce9-4ce76e329edf-kube-api-access-hflvb\") pod \"auto-csr-approver-29536222-m8lnm\" (UID: \"be5f726c-3d3b-4b48-9ce9-4ce76e329edf\") " pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.419991 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hflvb\" (UniqueName: \"kubernetes.io/projected/be5f726c-3d3b-4b48-9ce9-4ce76e329edf-kube-api-access-hflvb\") pod \"auto-csr-approver-29536222-m8lnm\" (UID: \"be5f726c-3d3b-4b48-9ce9-4ce76e329edf\") " pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.495805 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.788839 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536222-m8lnm"] Feb 27 06:22:00 crc kubenswrapper[4725]: I0227 06:22:00.979187 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" event={"ID":"be5f726c-3d3b-4b48-9ce9-4ce76e329edf","Type":"ContainerStarted","Data":"ab92a297aa3bb03bbd55875b3c27cc23d9b6603c2b5e0a9115b7e8339f7cd1a9"} Feb 27 06:22:02 crc kubenswrapper[4725]: I0227 06:22:02.998603 4725 generic.go:334] "Generic (PLEG): container finished" podID="be5f726c-3d3b-4b48-9ce9-4ce76e329edf" containerID="9a6d2e987fbb19bf18c1c83af76ab8a75f1430f9a9e2ecfb2bd70b7b1a1ce096" exitCode=0 Feb 27 06:22:02 crc kubenswrapper[4725]: I0227 06:22:02.998679 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" event={"ID":"be5f726c-3d3b-4b48-9ce9-4ce76e329edf","Type":"ContainerDied","Data":"9a6d2e987fbb19bf18c1c83af76ab8a75f1430f9a9e2ecfb2bd70b7b1a1ce096"} Feb 27 06:22:04 crc kubenswrapper[4725]: I0227 06:22:04.327137 4725 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:04 crc kubenswrapper[4725]: I0227 06:22:04.442904 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hflvb\" (UniqueName: \"kubernetes.io/projected/be5f726c-3d3b-4b48-9ce9-4ce76e329edf-kube-api-access-hflvb\") pod \"be5f726c-3d3b-4b48-9ce9-4ce76e329edf\" (UID: \"be5f726c-3d3b-4b48-9ce9-4ce76e329edf\") " Feb 27 06:22:04 crc kubenswrapper[4725]: I0227 06:22:04.451143 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5f726c-3d3b-4b48-9ce9-4ce76e329edf-kube-api-access-hflvb" (OuterVolumeSpecName: "kube-api-access-hflvb") pod "be5f726c-3d3b-4b48-9ce9-4ce76e329edf" (UID: "be5f726c-3d3b-4b48-9ce9-4ce76e329edf"). InnerVolumeSpecName "kube-api-access-hflvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:22:04 crc kubenswrapper[4725]: I0227 06:22:04.545549 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hflvb\" (UniqueName: \"kubernetes.io/projected/be5f726c-3d3b-4b48-9ce9-4ce76e329edf-kube-api-access-hflvb\") on node \"crc\" DevicePath \"\"" Feb 27 06:22:05 crc kubenswrapper[4725]: I0227 06:22:05.014808 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" event={"ID":"be5f726c-3d3b-4b48-9ce9-4ce76e329edf","Type":"ContainerDied","Data":"ab92a297aa3bb03bbd55875b3c27cc23d9b6603c2b5e0a9115b7e8339f7cd1a9"} Feb 27 06:22:05 crc kubenswrapper[4725]: I0227 06:22:05.014877 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536222-m8lnm" Feb 27 06:22:05 crc kubenswrapper[4725]: I0227 06:22:05.014884 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab92a297aa3bb03bbd55875b3c27cc23d9b6603c2b5e0a9115b7e8339f7cd1a9" Feb 27 06:22:05 crc kubenswrapper[4725]: I0227 06:22:05.398747 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536216-tw6gq"] Feb 27 06:22:05 crc kubenswrapper[4725]: I0227 06:22:05.405229 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536216-tw6gq"] Feb 27 06:22:06 crc kubenswrapper[4725]: I0227 06:22:06.279498 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13fdc97-d080-45c4-a03e-12e51a0c85bf" path="/var/lib/kubelet/pods/d13fdc97-d080-45c4-a03e-12e51a0c85bf/volumes" Feb 27 06:22:39 crc kubenswrapper[4725]: I0227 06:22:39.465369 4725 scope.go:117] "RemoveContainer" containerID="79ed38471fa17537cea18a8f940e4ebf006c264304ac1259edf5facf64740b87" Feb 27 06:23:02 crc kubenswrapper[4725]: I0227 06:23:02.554819 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:23:02 crc kubenswrapper[4725]: I0227 06:23:02.555685 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.666100 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl"] 
Feb 27 06:23:11 crc kubenswrapper[4725]: E0227 06:23:11.666939 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5f726c-3d3b-4b48-9ce9-4ce76e329edf" containerName="oc" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.666954 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5f726c-3d3b-4b48-9ce9-4ce76e329edf" containerName="oc" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.667095 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5f726c-3d3b-4b48-9ce9-4ce76e329edf" containerName="oc" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.667524 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.669963 4725 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hd889" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.671245 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-4dbtd"] Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.672070 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4dbtd" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.672444 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.672444 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.674786 4725 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gzcks" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.684982 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4dbtd"] Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.692122 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl"] Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.700856 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-frkvb"] Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.701589 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.704837 4725 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tnlh2" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.719894 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-frkvb"] Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.860998 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr8k7\" (UniqueName: \"kubernetes.io/projected/4c233d31-e0c7-4e39-9092-7df4e4b23c96-kube-api-access-pr8k7\") pod \"cert-manager-webhook-687f57d79b-frkvb\" (UID: \"4c233d31-e0c7-4e39-9092-7df4e4b23c96\") " pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.861375 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9dv\" (UniqueName: \"kubernetes.io/projected/1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9-kube-api-access-hg9dv\") pod \"cert-manager-858654f9db-4dbtd\" (UID: \"1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9\") " pod="cert-manager/cert-manager-858654f9db-4dbtd" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.861478 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76z9s\" (UniqueName: \"kubernetes.io/projected/17417e2f-0dcc-4720-8766-65a0d193ae26-kube-api-access-76z9s\") pod \"cert-manager-cainjector-cf98fcc89-ggzrl\" (UID: \"17417e2f-0dcc-4720-8766-65a0d193ae26\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.962169 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9dv\" (UniqueName: 
\"kubernetes.io/projected/1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9-kube-api-access-hg9dv\") pod \"cert-manager-858654f9db-4dbtd\" (UID: \"1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9\") " pod="cert-manager/cert-manager-858654f9db-4dbtd" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.962821 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76z9s\" (UniqueName: \"kubernetes.io/projected/17417e2f-0dcc-4720-8766-65a0d193ae26-kube-api-access-76z9s\") pod \"cert-manager-cainjector-cf98fcc89-ggzrl\" (UID: \"17417e2f-0dcc-4720-8766-65a0d193ae26\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" Feb 27 06:23:11 crc kubenswrapper[4725]: I0227 06:23:11.963031 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr8k7\" (UniqueName: \"kubernetes.io/projected/4c233d31-e0c7-4e39-9092-7df4e4b23c96-kube-api-access-pr8k7\") pod \"cert-manager-webhook-687f57d79b-frkvb\" (UID: \"4c233d31-e0c7-4e39-9092-7df4e4b23c96\") " pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.006639 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76z9s\" (UniqueName: \"kubernetes.io/projected/17417e2f-0dcc-4720-8766-65a0d193ae26-kube-api-access-76z9s\") pod \"cert-manager-cainjector-cf98fcc89-ggzrl\" (UID: \"17417e2f-0dcc-4720-8766-65a0d193ae26\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.006823 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9dv\" (UniqueName: \"kubernetes.io/projected/1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9-kube-api-access-hg9dv\") pod \"cert-manager-858654f9db-4dbtd\" (UID: \"1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9\") " pod="cert-manager/cert-manager-858654f9db-4dbtd" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.009871 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pr8k7\" (UniqueName: \"kubernetes.io/projected/4c233d31-e0c7-4e39-9092-7df4e4b23c96-kube-api-access-pr8k7\") pod \"cert-manager-webhook-687f57d79b-frkvb\" (UID: \"4c233d31-e0c7-4e39-9092-7df4e4b23c96\") " pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.027556 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.287238 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.295665 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4dbtd" Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.523946 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-frkvb"] Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.622835 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl"] Feb 27 06:23:12 crc kubenswrapper[4725]: W0227 06:23:12.632442 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17417e2f_0dcc_4720_8766_65a0d193ae26.slice/crio-f27b662bb36557ce4821ee219c7602a17fa6e59bf06350bdfef70b37e21205cf WatchSource:0}: Error finding container f27b662bb36557ce4821ee219c7602a17fa6e59bf06350bdfef70b37e21205cf: Status 404 returned error can't find the container with id f27b662bb36557ce4821ee219c7602a17fa6e59bf06350bdfef70b37e21205cf Feb 27 06:23:12 crc kubenswrapper[4725]: I0227 06:23:12.678673 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4dbtd"] Feb 27 06:23:12 crc 
kubenswrapper[4725]: W0227 06:23:12.681752 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9eb526_ea0d_4c3b_a6e8_309bee7c42f9.slice/crio-60242a10f14350e2d00a116af76457123572a992bc01c5c5aa4e8719551111c3 WatchSource:0}: Error finding container 60242a10f14350e2d00a116af76457123572a992bc01c5c5aa4e8719551111c3: Status 404 returned error can't find the container with id 60242a10f14350e2d00a116af76457123572a992bc01c5c5aa4e8719551111c3 Feb 27 06:23:13 crc kubenswrapper[4725]: I0227 06:23:13.527883 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" event={"ID":"17417e2f-0dcc-4720-8766-65a0d193ae26","Type":"ContainerStarted","Data":"f27b662bb36557ce4821ee219c7602a17fa6e59bf06350bdfef70b37e21205cf"} Feb 27 06:23:13 crc kubenswrapper[4725]: I0227 06:23:13.528646 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4dbtd" event={"ID":"1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9","Type":"ContainerStarted","Data":"60242a10f14350e2d00a116af76457123572a992bc01c5c5aa4e8719551111c3"} Feb 27 06:23:13 crc kubenswrapper[4725]: I0227 06:23:13.529648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" event={"ID":"4c233d31-e0c7-4e39-9092-7df4e4b23c96","Type":"ContainerStarted","Data":"d5e901fa2f93bd1286dd716fe72f2566a887f17d5e419c23569d5c96fbd70fe9"} Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.547945 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" event={"ID":"17417e2f-0dcc-4720-8766-65a0d193ae26","Type":"ContainerStarted","Data":"a5fb43c8eb4acbbe164f442e4cb4cb12636a1d436a266301c64fca59a78eaf6c"} Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.550347 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4dbtd" 
event={"ID":"1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9","Type":"ContainerStarted","Data":"f055b5625f40c39b6af65762ff35a2de0f73f7f757c8e98dd0fbe7b52950b1a0"} Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.552668 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" event={"ID":"4c233d31-e0c7-4e39-9092-7df4e4b23c96","Type":"ContainerStarted","Data":"575aa110e35d865a66723c366f3667cc9c1cd526dfab7ae2079d3344cd64fb3f"} Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.553224 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.579325 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ggzrl" podStartSLOduration=2.326107766 podStartE2EDuration="5.579275239s" podCreationTimestamp="2026-02-27 06:23:11 +0000 UTC" firstStartedPulling="2026-02-27 06:23:12.636331761 +0000 UTC m=+771.098952330" lastFinishedPulling="2026-02-27 06:23:15.889499224 +0000 UTC m=+774.352119803" observedRunningTime="2026-02-27 06:23:16.569266272 +0000 UTC m=+775.031886851" watchObservedRunningTime="2026-02-27 06:23:16.579275239 +0000 UTC m=+775.041895848" Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.598623 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-4dbtd" podStartSLOduration=2.300734265 podStartE2EDuration="5.598607514s" podCreationTimestamp="2026-02-27 06:23:11 +0000 UTC" firstStartedPulling="2026-02-27 06:23:12.684056571 +0000 UTC m=+771.146677140" lastFinishedPulling="2026-02-27 06:23:15.98192977 +0000 UTC m=+774.444550389" observedRunningTime="2026-02-27 06:23:16.597642947 +0000 UTC m=+775.060263516" watchObservedRunningTime="2026-02-27 06:23:16.598607514 +0000 UTC m=+775.061228073" Feb 27 06:23:16 crc kubenswrapper[4725]: I0227 06:23:16.628265 4725 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" podStartSLOduration=2.285286238 podStartE2EDuration="5.628239183s" podCreationTimestamp="2026-02-27 06:23:11 +0000 UTC" firstStartedPulling="2026-02-27 06:23:12.546524968 +0000 UTC m=+771.009145537" lastFinishedPulling="2026-02-27 06:23:15.889477873 +0000 UTC m=+774.352098482" observedRunningTime="2026-02-27 06:23:16.620818058 +0000 UTC m=+775.083438627" watchObservedRunningTime="2026-02-27 06:23:16.628239183 +0000 UTC m=+775.090859762" Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.687757 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lchm9"] Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.688916 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-controller" containerID="cri-o://7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.688980 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="nbdb" containerID="cri-o://9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.689158 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="northd" containerID="cri-o://4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.689238 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.689311 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-node" containerID="cri-o://56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.689372 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-acl-logging" containerID="cri-o://24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.689581 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="sbdb" containerID="cri-o://0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" gracePeriod=30 Feb 27 06:23:21 crc kubenswrapper[4725]: I0227 06:23:21.773985 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" containerID="cri-o://e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" gracePeriod=30 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.031448 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-frkvb" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.090711 4725 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/3.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.093424 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovn-acl-logging/0.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.093946 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovn-controller/0.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.094462 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113051 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-bin\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113080 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-slash\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113101 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-config\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113124 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-ovn-kubernetes\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113144 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-log-socket\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113143 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113182 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05a446dc-e501-4173-a911-7b33ca4608c6-ovn-node-metrics-cert\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113170 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-slash" (OuterVolumeSpecName: "host-slash") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113205 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-log-socket" (OuterVolumeSpecName: "log-socket") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113314 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113275 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113345 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-openvswitch\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113379 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113417 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113466 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-systemd\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113517 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-netd\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113554 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbt4v\" (UniqueName: \"kubernetes.io/projected/05a446dc-e501-4173-a911-7b33ca4608c6-kube-api-access-sbt4v\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113591 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-node-log\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113599 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113647 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-systemd-units\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113687 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-var-lib-openvswitch\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113739 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-netns\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113809 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-env-overrides\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113862 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-kubelet\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113909 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-script-lib\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113939 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-etc-openvswitch\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.113968 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-ovn\") pod \"05a446dc-e501-4173-a911-7b33ca4608c6\" (UID: \"05a446dc-e501-4173-a911-7b33ca4608c6\") " Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114148 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114189 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114211 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114327 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-node-log" (OuterVolumeSpecName: "node-log") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114341 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114483 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114656 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114650 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.114860 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.115094 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117438 4725 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117478 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117500 4725 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117518 4725 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117535 4725 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-slash\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117552 4725 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117570 4725 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117588 4725 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117607 4725 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-log-socket\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117627 4725 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117645 4725 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117663 4725 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117680 4725 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-node-log\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117697 4725 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117714 4725 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117732 4725 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.117751 4725 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/05a446dc-e501-4173-a911-7b33ca4608c6-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.120904 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a446dc-e501-4173-a911-7b33ca4608c6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.121024 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a446dc-e501-4173-a911-7b33ca4608c6-kube-api-access-sbt4v" (OuterVolumeSpecName: "kube-api-access-sbt4v") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "kube-api-access-sbt4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.134208 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "05a446dc-e501-4173-a911-7b33ca4608c6" (UID: "05a446dc-e501-4173-a911-7b33ca4608c6"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.172888 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wrk2b"] Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173157 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173172 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173182 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173188 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173198 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173204 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173214 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="sbdb" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173219 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="sbdb" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173230 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" 
containerName="kubecfg-setup" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173236 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kubecfg-setup" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173243 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-acl-logging" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173250 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-acl-logging" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173257 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-node" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173263 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-node" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173271 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="northd" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173276 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="northd" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173297 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173303 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173312 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="nbdb" Feb 27 
06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173318 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="nbdb" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173328 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173334 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173427 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173439 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="nbdb" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173447 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173452 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="sbdb" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173461 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173468 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="northd" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173476 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" 
containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173483 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="kube-rbac-proxy-node" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173490 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173498 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovn-acl-logging" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173507 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173585 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173594 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173681 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.173766 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.173773 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" containerName="ovnkube-controller" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.177711 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219102 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-systemd-units\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219158 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219191 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-kubelet\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219225 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2wkn\" (UniqueName: \"kubernetes.io/projected/384b2947-2df2-439f-9a8f-676cb9052fbe-kube-api-access-r2wkn\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219247 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-ovnkube-script-lib\") 
pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-ovnkube-config\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219400 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/384b2947-2df2-439f-9a8f-676cb9052fbe-ovn-node-metrics-cert\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219440 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-cni-bin\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219468 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-run-netns\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219487 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-log-socket\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219507 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-systemd\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219533 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-ovn\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219712 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-etc-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219759 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-cni-netd\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.219882 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-slash\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220095 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220155 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-var-lib-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220195 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-node-log\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220217 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220241 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-env-overrides\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220356 4725 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/05a446dc-e501-4173-a911-7b33ca4608c6-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220376 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbt4v\" (UniqueName: \"kubernetes.io/projected/05a446dc-e501-4173-a911-7b33ca4608c6-kube-api-access-sbt4v\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.220400 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/05a446dc-e501-4173-a911-7b33ca4608c6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.321349 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/384b2947-2df2-439f-9a8f-676cb9052fbe-ovn-node-metrics-cert\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.321955 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-cni-bin\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322079 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-cni-bin\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322102 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-run-netns\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322225 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-log-socket\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-systemd\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-ovn\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322377 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-log-socket\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322409 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-etc-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322493 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-systemd\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322540 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-cni-netd\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322539 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-ovn\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322627 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-cni-netd\") pod \"ovnkube-node-wrk2b\" (UID: 
\"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322519 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-etc-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322653 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-slash\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322629 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-slash\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322725 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322791 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-var-lib-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 
27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-run-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-var-lib-openvswitch\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322863 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-node-log\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322906 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.323079 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-env-overrides\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 
06:23:22.322939 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-node-log\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.323201 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-run-netns\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.322986 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324366 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-env-overrides\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324500 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-systemd-units\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324596 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-systemd-units\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324662 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324738 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324886 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-kubelet\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.324926 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/384b2947-2df2-439f-9a8f-676cb9052fbe-host-kubelet\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.325006 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-ovnkube-script-lib\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.325985 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-ovnkube-script-lib\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.326072 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2wkn\" (UniqueName: \"kubernetes.io/projected/384b2947-2df2-439f-9a8f-676cb9052fbe-kube-api-access-r2wkn\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.326653 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-ovnkube-config\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.327734 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/384b2947-2df2-439f-9a8f-676cb9052fbe-ovn-node-metrics-cert\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.327785 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/384b2947-2df2-439f-9a8f-676cb9052fbe-ovnkube-config\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.358712 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2wkn\" (UniqueName: \"kubernetes.io/projected/384b2947-2df2-439f-9a8f-676cb9052fbe-kube-api-access-r2wkn\") pod \"ovnkube-node-wrk2b\" (UID: \"384b2947-2df2-439f-9a8f-676cb9052fbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.502757 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:22 crc kubenswrapper[4725]: W0227 06:23:22.535532 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod384b2947_2df2_439f_9a8f_676cb9052fbe.slice/crio-e3d728d07820412510af665fbbd48813563f92485614eae6b57655973bb9dba7 WatchSource:0}: Error finding container e3d728d07820412510af665fbbd48813563f92485614eae6b57655973bb9dba7: Status 404 returned error can't find the container with id e3d728d07820412510af665fbbd48813563f92485614eae6b57655973bb9dba7 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.604235 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"e3d728d07820412510af665fbbd48813563f92485614eae6b57655973bb9dba7"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.608201 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovnkube-controller/3.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.612462 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovn-acl-logging/0.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.613265 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lchm9_05a446dc-e501-4173-a911-7b33ca4608c6/ovn-controller/0.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614016 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" exitCode=0 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614057 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" exitCode=0 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614139 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" exitCode=0 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614169 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" exitCode=0 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614184 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" exitCode=0 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614197 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" exitCode=0 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614210 4725 generic.go:334] "Generic (PLEG): 
container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" exitCode=143 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614224 4725 generic.go:334] "Generic (PLEG): container finished" podID="05a446dc-e501-4173-a911-7b33ca4608c6" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" exitCode=143 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614124 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614363 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614409 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614428 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} 
Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614489 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614541 4725 scope.go:117] "RemoveContainer" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614673 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614693 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614705 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614716 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614727 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614737 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} Feb 27 
06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614748 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614759 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614770 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614786 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614803 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614815 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614826 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614838 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614849 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614860 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614871 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614882 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614893 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614903 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614919 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614935 4725 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614947 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614959 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614969 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614980 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.614990 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615001 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615011 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615022 4725 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615032 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615077 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" event={"ID":"05a446dc-e501-4173-a911-7b33ca4608c6","Type":"ContainerDied","Data":"35e552ee2754b532b808590a888e54469cd21b2f24d9cd0ffc830180f60958a9"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615093 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615105 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615116 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615127 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615138 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} Feb 27 
06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615149 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615158 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615169 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615180 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.615191 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.618209 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lchm9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.620080 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/2.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.620785 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/1.log" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.620992 4725 generic.go:334] "Generic (PLEG): container finished" podID="7439e599-9b13-45e6-8f71-ef3760b2235b" containerID="e0f1f8817193a70ed97103d3c02f10cc027d4cd5706eedd949743fced02e5989" exitCode=2 Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.621121 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerDied","Data":"e0f1f8817193a70ed97103d3c02f10cc027d4cd5706eedd949743fced02e5989"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.621196 4725 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd"} Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.622101 4725 scope.go:117] "RemoveContainer" containerID="e0f1f8817193a70ed97103d3c02f10cc027d4cd5706eedd949743fced02e5989" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.623001 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-g8jqm_openshift-multus(7439e599-9b13-45e6-8f71-ef3760b2235b)\"" pod="openshift-multus/multus-g8jqm" podUID="7439e599-9b13-45e6-8f71-ef3760b2235b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.681549 4725 scope.go:117] "RemoveContainer" 
containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.700460 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lchm9"] Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.708206 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lchm9"] Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.712536 4725 scope.go:117] "RemoveContainer" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.733873 4725 scope.go:117] "RemoveContainer" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.805146 4725 scope.go:117] "RemoveContainer" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.825969 4725 scope.go:117] "RemoveContainer" containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.858257 4725 scope.go:117] "RemoveContainer" containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.894371 4725 scope.go:117] "RemoveContainer" containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.913633 4725 scope.go:117] "RemoveContainer" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.940259 4725 scope.go:117] "RemoveContainer" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.962973 4725 scope.go:117] "RemoveContainer" 
containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.963888 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": container with ID starting with e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6 not found: ID does not exist" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.963985 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} err="failed to get container status \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": rpc error: code = NotFound desc = could not find container \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": container with ID starting with e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.964031 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.964554 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": container with ID starting with 795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d not found: ID does not exist" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.964685 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} err="failed to get container status \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": rpc error: code = NotFound desc = could not find container \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": container with ID starting with 795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.964823 4725 scope.go:117] "RemoveContainer" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.965712 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": container with ID starting with 0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9 not found: ID does not exist" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.965788 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} err="failed to get container status \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": rpc error: code = NotFound desc = could not find container \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": container with ID starting with 0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.965840 4725 scope.go:117] "RemoveContainer" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.966469 4725 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": container with ID starting with 9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2 not found: ID does not exist" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.966588 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} err="failed to get container status \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": rpc error: code = NotFound desc = could not find container \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": container with ID starting with 9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.966696 4725 scope.go:117] "RemoveContainer" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.967219 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": container with ID starting with 4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620 not found: ID does not exist" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.967378 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} err="failed to get container status \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": rpc error: code = NotFound desc = could not find container 
\"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": container with ID starting with 4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.967476 4725 scope.go:117] "RemoveContainer" containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.968141 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": container with ID starting with d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80 not found: ID does not exist" containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.968196 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} err="failed to get container status \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": rpc error: code = NotFound desc = could not find container \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": container with ID starting with d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.968228 4725 scope.go:117] "RemoveContainer" containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.968771 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": container with ID starting with 56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a not found: ID does not exist" 
containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.968823 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} err="failed to get container status \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": rpc error: code = NotFound desc = could not find container \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": container with ID starting with 56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.968894 4725 scope.go:117] "RemoveContainer" containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.969627 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": container with ID starting with 24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b not found: ID does not exist" containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.969710 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} err="failed to get container status \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": rpc error: code = NotFound desc = could not find container \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": container with ID starting with 24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.969748 4725 scope.go:117] 
"RemoveContainer" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.970545 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": container with ID starting with 7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4 not found: ID does not exist" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.970592 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} err="failed to get container status \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": rpc error: code = NotFound desc = could not find container \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": container with ID starting with 7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.970627 4725 scope.go:117] "RemoveContainer" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" Feb 27 06:23:22 crc kubenswrapper[4725]: E0227 06:23:22.971096 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": container with ID starting with e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed not found: ID does not exist" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.971148 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} err="failed to get container status \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": rpc error: code = NotFound desc = could not find container \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": container with ID starting with e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.971179 4725 scope.go:117] "RemoveContainer" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.971575 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} err="failed to get container status \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": rpc error: code = NotFound desc = could not find container \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": container with ID starting with e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.971621 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.972013 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} err="failed to get container status \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": rpc error: code = NotFound desc = could not find container \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": container with ID starting with 795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d not found: ID does not 
exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.972058 4725 scope.go:117] "RemoveContainer" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.972575 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} err="failed to get container status \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": rpc error: code = NotFound desc = could not find container \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": container with ID starting with 0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.972617 4725 scope.go:117] "RemoveContainer" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.973026 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} err="failed to get container status \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": rpc error: code = NotFound desc = could not find container \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": container with ID starting with 9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.973065 4725 scope.go:117] "RemoveContainer" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.973541 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} err="failed to get container status 
\"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": rpc error: code = NotFound desc = could not find container \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": container with ID starting with 4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.973602 4725 scope.go:117] "RemoveContainer" containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.974116 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} err="failed to get container status \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": rpc error: code = NotFound desc = could not find container \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": container with ID starting with d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.974175 4725 scope.go:117] "RemoveContainer" containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.974530 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} err="failed to get container status \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": rpc error: code = NotFound desc = could not find container \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": container with ID starting with 56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.974578 4725 scope.go:117] "RemoveContainer" 
containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.974896 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} err="failed to get container status \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": rpc error: code = NotFound desc = could not find container \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": container with ID starting with 24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.974934 4725 scope.go:117] "RemoveContainer" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.975345 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} err="failed to get container status \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": rpc error: code = NotFound desc = could not find container \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": container with ID starting with 7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.975390 4725 scope.go:117] "RemoveContainer" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.975689 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} err="failed to get container status \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": rpc error: code = NotFound desc = could 
not find container \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": container with ID starting with e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.975724 4725 scope.go:117] "RemoveContainer" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.976073 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} err="failed to get container status \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": rpc error: code = NotFound desc = could not find container \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": container with ID starting with e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.976111 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.976518 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} err="failed to get container status \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": rpc error: code = NotFound desc = could not find container \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": container with ID starting with 795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.976546 4725 scope.go:117] "RemoveContainer" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 
06:23:22.976920 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} err="failed to get container status \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": rpc error: code = NotFound desc = could not find container \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": container with ID starting with 0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.976957 4725 scope.go:117] "RemoveContainer" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.977538 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} err="failed to get container status \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": rpc error: code = NotFound desc = could not find container \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": container with ID starting with 9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.977588 4725 scope.go:117] "RemoveContainer" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.978150 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} err="failed to get container status \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": rpc error: code = NotFound desc = could not find container \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": container with ID starting with 
4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.978193 4725 scope.go:117] "RemoveContainer" containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.978548 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} err="failed to get container status \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": rpc error: code = NotFound desc = could not find container \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": container with ID starting with d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.978585 4725 scope.go:117] "RemoveContainer" containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.978939 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} err="failed to get container status \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": rpc error: code = NotFound desc = could not find container \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": container with ID starting with 56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.978975 4725 scope.go:117] "RemoveContainer" containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.979919 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} err="failed to get container status \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": rpc error: code = NotFound desc = could not find container \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": container with ID starting with 24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.980155 4725 scope.go:117] "RemoveContainer" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.980597 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} err="failed to get container status \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": rpc error: code = NotFound desc = could not find container \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": container with ID starting with 7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.980638 4725 scope.go:117] "RemoveContainer" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.980984 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} err="failed to get container status \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": rpc error: code = NotFound desc = could not find container \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": container with ID starting with e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed not found: ID does not 
exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.981018 4725 scope.go:117] "RemoveContainer" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.981414 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} err="failed to get container status \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": rpc error: code = NotFound desc = could not find container \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": container with ID starting with e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.981457 4725 scope.go:117] "RemoveContainer" containerID="795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.981777 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d"} err="failed to get container status \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": rpc error: code = NotFound desc = could not find container \"795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d\": container with ID starting with 795a89a708a5eff63455957bf44c0c3e5d22536d2de2ca94b15399844af5c80d not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.981812 4725 scope.go:117] "RemoveContainer" containerID="0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.982131 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9"} err="failed to get container status 
\"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": rpc error: code = NotFound desc = could not find container \"0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9\": container with ID starting with 0f6bddd7e8156250a82d48c887aff7197ade4ea59e1776c8abb239e2a6a97db9 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.982166 4725 scope.go:117] "RemoveContainer" containerID="9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.983496 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2"} err="failed to get container status \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": rpc error: code = NotFound desc = could not find container \"9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2\": container with ID starting with 9b3a0bf8dec21e6dffb8adcd3e7c9e2b620f3d2fb6d54c806b1cc817bffed0c2 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.983542 4725 scope.go:117] "RemoveContainer" containerID="4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.983954 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620"} err="failed to get container status \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": rpc error: code = NotFound desc = could not find container \"4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620\": container with ID starting with 4bda3fe73b64b16d9c40f3c8089f40da3237167fd38a5faeb3ce310ec745e620 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.983994 4725 scope.go:117] "RemoveContainer" 
containerID="d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.984456 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80"} err="failed to get container status \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": rpc error: code = NotFound desc = could not find container \"d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80\": container with ID starting with d60af73940faa7baece8e045b8d12a5337916298e77b0a3c3aa211d3a9c85b80 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.984496 4725 scope.go:117] "RemoveContainer" containerID="56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.984832 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a"} err="failed to get container status \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": rpc error: code = NotFound desc = could not find container \"56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a\": container with ID starting with 56c21fe256a66dc5b294d9f61a88ff627d5b98390ccc26f6cea5935bd9064f2a not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.984869 4725 scope.go:117] "RemoveContainer" containerID="24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.985249 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b"} err="failed to get container status \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": rpc error: code = NotFound desc = could 
not find container \"24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b\": container with ID starting with 24afba3e8dcc79d07b41b66ea05ece81a980343d2b914d7dcbd2f67e7895ea4b not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.985310 4725 scope.go:117] "RemoveContainer" containerID="7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.985614 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4"} err="failed to get container status \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": rpc error: code = NotFound desc = could not find container \"7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4\": container with ID starting with 7ae733a823ad608c8bd32baa35b75ca63ebaff0e6339a06d73f51c8e3b599fb4 not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.985653 4725 scope.go:117] "RemoveContainer" containerID="e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.986069 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed"} err="failed to get container status \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": rpc error: code = NotFound desc = could not find container \"e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed\": container with ID starting with e19ab55ab098c3f707f6d4e2325a4b25b437c80df4bcafcbfe0a7cc0448d51ed not found: ID does not exist" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 06:23:22.986109 4725 scope.go:117] "RemoveContainer" containerID="e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6" Feb 27 06:23:22 crc kubenswrapper[4725]: I0227 
06:23:22.986482 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6"} err="failed to get container status \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": rpc error: code = NotFound desc = could not find container \"e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6\": container with ID starting with e029239c541b53beb997f1f5585a39ad874eff0cf21f7ef3cdda0df85f618eb6 not found: ID does not exist" Feb 27 06:23:23 crc kubenswrapper[4725]: I0227 06:23:23.632979 4725 generic.go:334] "Generic (PLEG): container finished" podID="384b2947-2df2-439f-9a8f-676cb9052fbe" containerID="e84b68a9e0b7d858104bb20231c75397bb2ec42629a84cc660797aaad56f626d" exitCode=0 Feb 27 06:23:23 crc kubenswrapper[4725]: I0227 06:23:23.633064 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerDied","Data":"e84b68a9e0b7d858104bb20231c75397bb2ec42629a84cc660797aaad56f626d"} Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.263794 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a446dc-e501-4173-a911-7b33ca4608c6" path="/var/lib/kubelet/pods/05a446dc-e501-4173-a911-7b33ca4608c6/volumes" Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.652497 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"fd6e6d28dc8cca95d4de415e4cfd841ba15e701d0ca966922658e45031b9af1c"} Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.652565 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" 
event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"c5b984ebcda8a8fc5600c6a89faa1a5392eaa1c3a1478cee95c09949759d98a7"} Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.652584 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"24355c83c6cd7243c0796ae73ae0aded48f464ec48c34db24961b3d0a86201f6"} Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.652605 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"ff696f83274b8ebe287f60e5bee2271d3ab20e0d2379bb2cb82f6b28f9124a86"} Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.652621 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"a596e1e55deb304451a4c3c9fcfed80d2ce2cf2e1f5db2b022ca6077be25ee0a"} Feb 27 06:23:24 crc kubenswrapper[4725]: I0227 06:23:24.652633 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"401013a2c510aedd1be319b4ac234cac23415f6360b0e252d5d1ba3e28485e81"} Feb 27 06:23:27 crc kubenswrapper[4725]: I0227 06:23:27.699358 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"0cb254550766b83747aae9a1a18f4ad3334b7f8b3eba87e1e71ee56c3468727e"} Feb 27 06:23:29 crc kubenswrapper[4725]: I0227 06:23:29.720110 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" 
event={"ID":"384b2947-2df2-439f-9a8f-676cb9052fbe","Type":"ContainerStarted","Data":"850c5219ddef964dde6f7a7b89aba55c10c6200783de9b11b925dbda3ef1d09c"} Feb 27 06:23:29 crc kubenswrapper[4725]: I0227 06:23:29.720837 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:29 crc kubenswrapper[4725]: I0227 06:23:29.756752 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" podStartSLOduration=7.756728587 podStartE2EDuration="7.756728587s" podCreationTimestamp="2026-02-27 06:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:23:29.751730549 +0000 UTC m=+788.214351148" watchObservedRunningTime="2026-02-27 06:23:29.756728587 +0000 UTC m=+788.219349156" Feb 27 06:23:29 crc kubenswrapper[4725]: I0227 06:23:29.757028 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:30 crc kubenswrapper[4725]: I0227 06:23:30.728601 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:30 crc kubenswrapper[4725]: I0227 06:23:30.729086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:30 crc kubenswrapper[4725]: I0227 06:23:30.770520 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:23:32 crc kubenswrapper[4725]: I0227 06:23:32.554193 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Feb 27 06:23:32 crc kubenswrapper[4725]: I0227 06:23:32.554764 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:23:36 crc kubenswrapper[4725]: I0227 06:23:36.251601 4725 scope.go:117] "RemoveContainer" containerID="e0f1f8817193a70ed97103d3c02f10cc027d4cd5706eedd949743fced02e5989" Feb 27 06:23:36 crc kubenswrapper[4725]: E0227 06:23:36.252384 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-g8jqm_openshift-multus(7439e599-9b13-45e6-8f71-ef3760b2235b)\"" pod="openshift-multus/multus-g8jqm" podUID="7439e599-9b13-45e6-8f71-ef3760b2235b" Feb 27 06:23:39 crc kubenswrapper[4725]: I0227 06:23:39.532110 4725 scope.go:117] "RemoveContainer" containerID="275a3d87d3b5cd09fa5a3b43dbffa35c61527d844362e8ca775d5699e92ec8fd" Feb 27 06:23:39 crc kubenswrapper[4725]: I0227 06:23:39.806472 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/2.log" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.268258 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj"] Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.270564 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.273371 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.290280 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj"] Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.381480 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.381556 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.381652 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g82gp\" (UniqueName: \"kubernetes.io/projected/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-kube-api-access-g82gp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: 
I0227 06:23:49.483267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.483327 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.483367 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g82gp\" (UniqueName: \"kubernetes.io/projected/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-kube-api-access-g82gp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.484127 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.485545 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.518282 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g82gp\" (UniqueName: \"kubernetes.io/projected/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-kube-api-access-g82gp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.599563 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.631930 4725 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(8b89fc8e247f81a05496a04bd0db74577d31c046580c78f7906ad881f62a894e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.632078 4725 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(8b89fc8e247f81a05496a04bd0db74577d31c046580c78f7906ad881f62a894e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.632129 4725 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(8b89fc8e247f81a05496a04bd0db74577d31c046580c78f7906ad881f62a894e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.632236 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace(935bf92a-4c1e-47c1-a2e2-cd49a6db1b93)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace(935bf92a-4c1e-47c1-a2e2-cd49a6db1b93)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(8b89fc8e247f81a05496a04bd0db74577d31c046580c78f7906ad881f62a894e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.886681 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: I0227 06:23:49.887217 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.923534 4725 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(17e6c8584317a21e8d91b764ea7e08998b35739dc01618eed77387daa207749e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.923638 4725 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(17e6c8584317a21e8d91b764ea7e08998b35739dc01618eed77387daa207749e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.923674 4725 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(17e6c8584317a21e8d91b764ea7e08998b35739dc01618eed77387daa207749e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:23:49 crc kubenswrapper[4725]: E0227 06:23:49.923749 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace(935bf92a-4c1e-47c1-a2e2-cd49a6db1b93)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace(935bf92a-4c1e-47c1-a2e2-cd49a6db1b93)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_openshift-marketplace_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93_0(17e6c8584317a21e8d91b764ea7e08998b35739dc01618eed77387daa207749e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" Feb 27 06:23:51 crc kubenswrapper[4725]: I0227 06:23:51.251757 4725 scope.go:117] "RemoveContainer" containerID="e0f1f8817193a70ed97103d3c02f10cc027d4cd5706eedd949743fced02e5989" Feb 27 06:23:51 crc kubenswrapper[4725]: I0227 06:23:51.907249 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g8jqm_7439e599-9b13-45e6-8f71-ef3760b2235b/kube-multus/2.log" Feb 27 06:23:51 crc kubenswrapper[4725]: I0227 06:23:51.907822 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g8jqm" event={"ID":"7439e599-9b13-45e6-8f71-ef3760b2235b","Type":"ContainerStarted","Data":"bfca8811c805e6dd6bc5401deed410af42856d6e490efc1480a57d236ddf06ad"} Feb 27 06:23:52 crc kubenswrapper[4725]: I0227 06:23:52.532351 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrk2b" Feb 27 06:24:00 crc kubenswrapper[4725]: 
I0227 06:24:00.147863 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536224-m5jcf"] Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.149505 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.152550 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.153384 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.154100 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.163191 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536224-m5jcf"] Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.254650 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlpph\" (UniqueName: \"kubernetes.io/projected/fab9bd04-3665-4720-8cf9-9fb9cf78a016-kube-api-access-vlpph\") pod \"auto-csr-approver-29536224-m5jcf\" (UID: \"fab9bd04-3665-4720-8cf9-9fb9cf78a016\") " pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.356442 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlpph\" (UniqueName: \"kubernetes.io/projected/fab9bd04-3665-4720-8cf9-9fb9cf78a016-kube-api-access-vlpph\") pod \"auto-csr-approver-29536224-m5jcf\" (UID: \"fab9bd04-3665-4720-8cf9-9fb9cf78a016\") " pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.391773 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vlpph\" (UniqueName: \"kubernetes.io/projected/fab9bd04-3665-4720-8cf9-9fb9cf78a016-kube-api-access-vlpph\") pod \"auto-csr-approver-29536224-m5jcf\" (UID: \"fab9bd04-3665-4720-8cf9-9fb9cf78a016\") " pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.484005 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.750169 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536224-m5jcf"] Feb 27 06:24:00 crc kubenswrapper[4725]: I0227 06:24:00.980696 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" event={"ID":"fab9bd04-3665-4720-8cf9-9fb9cf78a016","Type":"ContainerStarted","Data":"79f79edc1f0ee25c4e259064d03eed0181858b18d54251d23b2414daf0448d76"} Feb 27 06:24:01 crc kubenswrapper[4725]: I0227 06:24:01.251504 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:24:01 crc kubenswrapper[4725]: I0227 06:24:01.252716 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:24:01 crc kubenswrapper[4725]: I0227 06:24:01.523678 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj"] Feb 27 06:24:01 crc kubenswrapper[4725]: W0227 06:24:01.536652 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935bf92a_4c1e_47c1_a2e2_cd49a6db1b93.slice/crio-d81d64dc6a6a6538afc6c58e9c55bf3d093f4fe06e517c29668d68c122dc483e WatchSource:0}: Error finding container d81d64dc6a6a6538afc6c58e9c55bf3d093f4fe06e517c29668d68c122dc483e: Status 404 returned error can't find the container with id d81d64dc6a6a6538afc6c58e9c55bf3d093f4fe06e517c29668d68c122dc483e Feb 27 06:24:01 crc kubenswrapper[4725]: I0227 06:24:01.988749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" event={"ID":"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93","Type":"ContainerStarted","Data":"d3d80cd8ff17ce07e7cdb54ead32d35843295e1bd77e216d4d783c7c8111531a"} Feb 27 06:24:01 crc kubenswrapper[4725]: I0227 06:24:01.988803 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" event={"ID":"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93","Type":"ContainerStarted","Data":"d81d64dc6a6a6538afc6c58e9c55bf3d093f4fe06e517c29668d68c122dc483e"} Feb 27 06:24:02 crc kubenswrapper[4725]: I0227 06:24:02.554463 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:24:02 crc kubenswrapper[4725]: I0227 
06:24:02.555093 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:24:02 crc kubenswrapper[4725]: I0227 06:24:02.555171 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:24:02 crc kubenswrapper[4725]: I0227 06:24:02.556261 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1f293a213b8036dbb26c658eb7bfe019d4be19b3467bb55e8ef79040281b13b"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:24:02 crc kubenswrapper[4725]: I0227 06:24:02.556406 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://f1f293a213b8036dbb26c658eb7bfe019d4be19b3467bb55e8ef79040281b13b" gracePeriod=600 Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.001736 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="f1f293a213b8036dbb26c658eb7bfe019d4be19b3467bb55e8ef79040281b13b" exitCode=0 Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.001798 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"f1f293a213b8036dbb26c658eb7bfe019d4be19b3467bb55e8ef79040281b13b"} Feb 27 06:24:03 crc 
kubenswrapper[4725]: I0227 06:24:03.002422 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"03dc8bea10798b61bde03e0e8912868ddc55a9db35d9f15615b091af21e96406"} Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.002446 4725 scope.go:117] "RemoveContainer" containerID="3de280f968967bd9a16dbc4f0f0a7eae44829783c4f2138d5e7a2fbcb79954ed" Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.013196 4725 generic.go:334] "Generic (PLEG): container finished" podID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerID="d3d80cd8ff17ce07e7cdb54ead32d35843295e1bd77e216d4d783c7c8111531a" exitCode=0 Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.013318 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" event={"ID":"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93","Type":"ContainerDied","Data":"d3d80cd8ff17ce07e7cdb54ead32d35843295e1bd77e216d4d783c7c8111531a"} Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.025026 4725 generic.go:334] "Generic (PLEG): container finished" podID="fab9bd04-3665-4720-8cf9-9fb9cf78a016" containerID="9fd2166b2f3cc3ee52a6ffcd3510fc2d804af69ad0cd03808ce4cca8dc90b614" exitCode=0 Feb 27 06:24:03 crc kubenswrapper[4725]: I0227 06:24:03.025115 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" event={"ID":"fab9bd04-3665-4720-8cf9-9fb9cf78a016","Type":"ContainerDied","Data":"9fd2166b2f3cc3ee52a6ffcd3510fc2d804af69ad0cd03808ce4cca8dc90b614"} Feb 27 06:24:04 crc kubenswrapper[4725]: I0227 06:24:04.385700 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:04 crc kubenswrapper[4725]: I0227 06:24:04.513056 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlpph\" (UniqueName: \"kubernetes.io/projected/fab9bd04-3665-4720-8cf9-9fb9cf78a016-kube-api-access-vlpph\") pod \"fab9bd04-3665-4720-8cf9-9fb9cf78a016\" (UID: \"fab9bd04-3665-4720-8cf9-9fb9cf78a016\") " Feb 27 06:24:04 crc kubenswrapper[4725]: I0227 06:24:04.524067 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab9bd04-3665-4720-8cf9-9fb9cf78a016-kube-api-access-vlpph" (OuterVolumeSpecName: "kube-api-access-vlpph") pod "fab9bd04-3665-4720-8cf9-9fb9cf78a016" (UID: "fab9bd04-3665-4720-8cf9-9fb9cf78a016"). InnerVolumeSpecName "kube-api-access-vlpph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:24:04 crc kubenswrapper[4725]: I0227 06:24:04.614666 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlpph\" (UniqueName: \"kubernetes.io/projected/fab9bd04-3665-4720-8cf9-9fb9cf78a016-kube-api-access-vlpph\") on node \"crc\" DevicePath \"\"" Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.051572 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.051568 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536224-m5jcf" event={"ID":"fab9bd04-3665-4720-8cf9-9fb9cf78a016","Type":"ContainerDied","Data":"79f79edc1f0ee25c4e259064d03eed0181858b18d54251d23b2414daf0448d76"} Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.051897 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f79edc1f0ee25c4e259064d03eed0181858b18d54251d23b2414daf0448d76" Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.056862 4725 generic.go:334] "Generic (PLEG): container finished" podID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerID="6ba5f08b20e36b6d63b1f1cf32ecd5be04ac336cdbe7bf38b441d2908a16cb77" exitCode=0 Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.056945 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" event={"ID":"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93","Type":"ContainerDied","Data":"6ba5f08b20e36b6d63b1f1cf32ecd5be04ac336cdbe7bf38b441d2908a16cb77"} Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.471957 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536218-dqxfn"] Feb 27 06:24:05 crc kubenswrapper[4725]: I0227 06:24:05.476511 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536218-dqxfn"] Feb 27 06:24:06 crc kubenswrapper[4725]: I0227 06:24:06.068849 4725 generic.go:334] "Generic (PLEG): container finished" podID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerID="f140f0fdc21e0db2a692b2605afd81aeceb7a41221ca9132ddddf4ac68b3db76" exitCode=0 Feb 27 06:24:06 crc kubenswrapper[4725]: I0227 06:24:06.068919 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" event={"ID":"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93","Type":"ContainerDied","Data":"f140f0fdc21e0db2a692b2605afd81aeceb7a41221ca9132ddddf4ac68b3db76"} Feb 27 06:24:06 crc kubenswrapper[4725]: I0227 06:24:06.264600 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7839ac25-2229-4c28-afd8-f8f6e997a018" path="/var/lib/kubelet/pods/7839ac25-2229-4c28-afd8-f8f6e997a018/volumes" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.409830 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.452605 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-util\") pod \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.452705 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g82gp\" (UniqueName: \"kubernetes.io/projected/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-kube-api-access-g82gp\") pod \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.452762 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-bundle\") pod \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\" (UID: \"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93\") " Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.459923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-bundle" (OuterVolumeSpecName: 
"bundle") pod "935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" (UID: "935bf92a-4c1e-47c1-a2e2-cd49a6db1b93"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.465142 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-kube-api-access-g82gp" (OuterVolumeSpecName: "kube-api-access-g82gp") pod "935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" (UID: "935bf92a-4c1e-47c1-a2e2-cd49a6db1b93"). InnerVolumeSpecName "kube-api-access-g82gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.480122 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-util" (OuterVolumeSpecName: "util") pod "935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" (UID: "935bf92a-4c1e-47c1-a2e2-cd49a6db1b93"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.553893 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g82gp\" (UniqueName: \"kubernetes.io/projected/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-kube-api-access-g82gp\") on node \"crc\" DevicePath \"\"" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.553932 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:24:07 crc kubenswrapper[4725]: I0227 06:24:07.553943 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/935bf92a-4c1e-47c1-a2e2-cd49a6db1b93-util\") on node \"crc\" DevicePath \"\"" Feb 27 06:24:08 crc kubenswrapper[4725]: I0227 06:24:08.089597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" event={"ID":"935bf92a-4c1e-47c1-a2e2-cd49a6db1b93","Type":"ContainerDied","Data":"d81d64dc6a6a6538afc6c58e9c55bf3d093f4fe06e517c29668d68c122dc483e"} Feb 27 06:24:08 crc kubenswrapper[4725]: I0227 06:24:08.089705 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81d64dc6a6a6538afc6c58e9c55bf3d093f4fe06e517c29668d68c122dc483e" Feb 27 06:24:08 crc kubenswrapper[4725]: I0227 06:24:08.089665 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj" Feb 27 06:24:10 crc kubenswrapper[4725]: I0227 06:24:10.873448 4725 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.375700 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6"] Feb 27 06:24:17 crc kubenswrapper[4725]: E0227 06:24:17.376617 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab9bd04-3665-4720-8cf9-9fb9cf78a016" containerName="oc" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.376637 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab9bd04-3665-4720-8cf9-9fb9cf78a016" containerName="oc" Feb 27 06:24:17 crc kubenswrapper[4725]: E0227 06:24:17.376658 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="pull" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.376671 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="pull" Feb 27 06:24:17 crc kubenswrapper[4725]: E0227 06:24:17.376688 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="util" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.376700 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="util" Feb 27 06:24:17 crc kubenswrapper[4725]: E0227 06:24:17.376717 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="extract" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.376729 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="extract" Feb 
27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.376895 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="935bf92a-4c1e-47c1-a2e2-cd49a6db1b93" containerName="extract" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.376921 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab9bd04-3665-4720-8cf9-9fb9cf78a016" containerName="oc" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.377539 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.379982 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-w2vjb" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.380172 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.382562 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.383641 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.477379 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm84g\" (UniqueName: \"kubernetes.io/projected/9fefe362-2058-4721-930e-9651059cfcc8-kube-api-access-bm84g\") pod \"obo-prometheus-operator-68bc856cb9-vpgt6\" (UID: \"9fefe362-2058-4721-930e-9651059cfcc8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.488225 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj"] Feb 27 06:24:17 crc 
kubenswrapper[4725]: I0227 06:24:17.488835 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.491439 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-jkxph" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.491669 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.500874 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.501476 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.517338 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.519658 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.578311 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f50c85e-bec7-4a58-9317-b86b3ba5e02c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj\" (UID: \"0f50c85e-bec7-4a58-9317-b86b3ba5e02c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.578376 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e243b80-5980-459f-ba42-90ebdd42e05b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74\" (UID: \"7e243b80-5980-459f-ba42-90ebdd42e05b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.578433 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e243b80-5980-459f-ba42-90ebdd42e05b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74\" (UID: \"7e243b80-5980-459f-ba42-90ebdd42e05b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.578472 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm84g\" (UniqueName: \"kubernetes.io/projected/9fefe362-2058-4721-930e-9651059cfcc8-kube-api-access-bm84g\") pod \"obo-prometheus-operator-68bc856cb9-vpgt6\" (UID: \"9fefe362-2058-4721-930e-9651059cfcc8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.578492 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f50c85e-bec7-4a58-9317-b86b3ba5e02c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj\" (UID: \"0f50c85e-bec7-4a58-9317-b86b3ba5e02c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.599333 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm84g\" (UniqueName: 
\"kubernetes.io/projected/9fefe362-2058-4721-930e-9651059cfcc8-kube-api-access-bm84g\") pod \"obo-prometheus-operator-68bc856cb9-vpgt6\" (UID: \"9fefe362-2058-4721-930e-9651059cfcc8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.679649 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e243b80-5980-459f-ba42-90ebdd42e05b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74\" (UID: \"7e243b80-5980-459f-ba42-90ebdd42e05b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.679703 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e243b80-5980-459f-ba42-90ebdd42e05b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74\" (UID: \"7e243b80-5980-459f-ba42-90ebdd42e05b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.679768 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f50c85e-bec7-4a58-9317-b86b3ba5e02c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj\" (UID: \"0f50c85e-bec7-4a58-9317-b86b3ba5e02c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.680576 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f50c85e-bec7-4a58-9317-b86b3ba5e02c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj\" (UID: \"0f50c85e-bec7-4a58-9317-b86b3ba5e02c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.683431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f50c85e-bec7-4a58-9317-b86b3ba5e02c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj\" (UID: \"0f50c85e-bec7-4a58-9317-b86b3ba5e02c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.683807 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e243b80-5980-459f-ba42-90ebdd42e05b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74\" (UID: \"7e243b80-5980-459f-ba42-90ebdd42e05b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.687899 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f50c85e-bec7-4a58-9317-b86b3ba5e02c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj\" (UID: \"0f50c85e-bec7-4a58-9317-b86b3ba5e02c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.696444 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e243b80-5980-459f-ba42-90ebdd42e05b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74\" (UID: \"7e243b80-5980-459f-ba42-90ebdd42e05b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.698299 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.712207 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-x6b52"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.720522 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.723915 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-nplwh" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.724183 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.752598 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-x6b52"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.783099 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjzr\" (UniqueName: \"kubernetes.io/projected/5810e280-be69-4236-9014-d459c65bd287-kube-api-access-kpjzr\") pod \"observability-operator-59bdc8b94-x6b52\" (UID: \"5810e280-be69-4236-9014-d459c65bd287\") " pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.783169 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5810e280-be69-4236-9014-d459c65bd287-observability-operator-tls\") pod \"observability-operator-59bdc8b94-x6b52\" (UID: \"5810e280-be69-4236-9014-d459c65bd287\") " pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 
06:24:17.807931 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.826645 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.887841 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5810e280-be69-4236-9014-d459c65bd287-observability-operator-tls\") pod \"observability-operator-59bdc8b94-x6b52\" (UID: \"5810e280-be69-4236-9014-d459c65bd287\") " pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.887906 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjzr\" (UniqueName: \"kubernetes.io/projected/5810e280-be69-4236-9014-d459c65bd287-kube-api-access-kpjzr\") pod \"observability-operator-59bdc8b94-x6b52\" (UID: \"5810e280-be69-4236-9014-d459c65bd287\") " pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.896474 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5810e280-be69-4236-9014-d459c65bd287-observability-operator-tls\") pod \"observability-operator-59bdc8b94-x6b52\" (UID: \"5810e280-be69-4236-9014-d459c65bd287\") " pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.927955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjzr\" (UniqueName: \"kubernetes.io/projected/5810e280-be69-4236-9014-d459c65bd287-kube-api-access-kpjzr\") pod 
\"observability-operator-59bdc8b94-x6b52\" (UID: \"5810e280-be69-4236-9014-d459c65bd287\") " pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.962274 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9h7hk"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.963012 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.969619 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-97l4g" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.985466 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9h7hk"] Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.989359 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwldx\" (UniqueName: \"kubernetes.io/projected/0c2b0104-f94a-4e8a-bcd0-464ac8942f54-kube-api-access-wwldx\") pod \"perses-operator-5bf474d74f-9h7hk\" (UID: \"0c2b0104-f94a-4e8a-bcd0-464ac8942f54\") " pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:17 crc kubenswrapper[4725]: I0227 06:24:17.989436 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c2b0104-f94a-4e8a-bcd0-464ac8942f54-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9h7hk\" (UID: \"0c2b0104-f94a-4e8a-bcd0-464ac8942f54\") " pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.091923 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwldx\" (UniqueName: 
\"kubernetes.io/projected/0c2b0104-f94a-4e8a-bcd0-464ac8942f54-kube-api-access-wwldx\") pod \"perses-operator-5bf474d74f-9h7hk\" (UID: \"0c2b0104-f94a-4e8a-bcd0-464ac8942f54\") " pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.092001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c2b0104-f94a-4e8a-bcd0-464ac8942f54-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9h7hk\" (UID: \"0c2b0104-f94a-4e8a-bcd0-464ac8942f54\") " pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.093148 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c2b0104-f94a-4e8a-bcd0-464ac8942f54-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9h7hk\" (UID: \"0c2b0104-f94a-4e8a-bcd0-464ac8942f54\") " pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.112115 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwldx\" (UniqueName: \"kubernetes.io/projected/0c2b0104-f94a-4e8a-bcd0-464ac8942f54-kube-api-access-wwldx\") pod \"perses-operator-5bf474d74f-9h7hk\" (UID: \"0c2b0104-f94a-4e8a-bcd0-464ac8942f54\") " pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.127203 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.287111 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.330909 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-x6b52"] Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.334794 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6"] Feb 27 06:24:18 crc kubenswrapper[4725]: W0227 06:24:18.340038 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5810e280_be69_4236_9014_d459c65bd287.slice/crio-272d8bcff578956fda5c3ece051a39fd5f42ea2c9dcaccb4c6f58b60bab94133 WatchSource:0}: Error finding container 272d8bcff578956fda5c3ece051a39fd5f42ea2c9dcaccb4c6f58b60bab94133: Status 404 returned error can't find the container with id 272d8bcff578956fda5c3ece051a39fd5f42ea2c9dcaccb4c6f58b60bab94133 Feb 27 06:24:18 crc kubenswrapper[4725]: W0227 06:24:18.341260 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fefe362_2058_4721_930e_9651059cfcc8.slice/crio-06d87890a55bd75eac7f951a3d677ecac4bad23baf6758b8a3d16b31cf9318df WatchSource:0}: Error finding container 06d87890a55bd75eac7f951a3d677ecac4bad23baf6758b8a3d16b31cf9318df: Status 404 returned error can't find the container with id 06d87890a55bd75eac7f951a3d677ecac4bad23baf6758b8a3d16b31cf9318df Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.398106 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj"] Feb 27 06:24:18 crc kubenswrapper[4725]: W0227 06:24:18.411990 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f50c85e_bec7_4a58_9317_b86b3ba5e02c.slice/crio-3dc346a79d096e41eeb2f3f966a59d3f3f33bc6f82feeae3125bb55830c3e8cd WatchSource:0}: Error finding container 3dc346a79d096e41eeb2f3f966a59d3f3f33bc6f82feeae3125bb55830c3e8cd: Status 404 returned error can't find the container with id 3dc346a79d096e41eeb2f3f966a59d3f3f33bc6f82feeae3125bb55830c3e8cd Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.426508 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74"] Feb 27 06:24:18 crc kubenswrapper[4725]: I0227 06:24:18.492058 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9h7hk"] Feb 27 06:24:18 crc kubenswrapper[4725]: W0227 06:24:18.499695 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c2b0104_f94a_4e8a_bcd0_464ac8942f54.slice/crio-27f62c63eed50d5f6143e9b44c076e8ebd2fd3a33b9dbe495f42602eba0cf4cb WatchSource:0}: Error finding container 27f62c63eed50d5f6143e9b44c076e8ebd2fd3a33b9dbe495f42602eba0cf4cb: Status 404 returned error can't find the container with id 27f62c63eed50d5f6143e9b44c076e8ebd2fd3a33b9dbe495f42602eba0cf4cb Feb 27 06:24:19 crc kubenswrapper[4725]: I0227 06:24:19.151950 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" event={"ID":"0f50c85e-bec7-4a58-9317-b86b3ba5e02c","Type":"ContainerStarted","Data":"3dc346a79d096e41eeb2f3f966a59d3f3f33bc6f82feeae3125bb55830c3e8cd"} Feb 27 06:24:19 crc kubenswrapper[4725]: I0227 06:24:19.153801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" 
event={"ID":"9fefe362-2058-4721-930e-9651059cfcc8","Type":"ContainerStarted","Data":"06d87890a55bd75eac7f951a3d677ecac4bad23baf6758b8a3d16b31cf9318df"} Feb 27 06:24:19 crc kubenswrapper[4725]: I0227 06:24:19.155000 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" event={"ID":"0c2b0104-f94a-4e8a-bcd0-464ac8942f54","Type":"ContainerStarted","Data":"27f62c63eed50d5f6143e9b44c076e8ebd2fd3a33b9dbe495f42602eba0cf4cb"} Feb 27 06:24:19 crc kubenswrapper[4725]: I0227 06:24:19.156186 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" event={"ID":"7e243b80-5980-459f-ba42-90ebdd42e05b","Type":"ContainerStarted","Data":"10415a2cf89241eb74b3837acf06f6d3eceaf2f68a61a5b8aa2df74aead4445d"} Feb 27 06:24:19 crc kubenswrapper[4725]: I0227 06:24:19.157483 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" event={"ID":"5810e280-be69-4236-9014-d459c65bd287","Type":"ContainerStarted","Data":"272d8bcff578956fda5c3ece051a39fd5f42ea2c9dcaccb4c6f58b60bab94133"} Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.260049 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" event={"ID":"0f50c85e-bec7-4a58-9317-b86b3ba5e02c","Type":"ContainerStarted","Data":"e1b90e827be166b13edfca6bbc49f972c18132a26ab66e73ca45d5696e0dcb23"} Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.262217 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" event={"ID":"9fefe362-2058-4721-930e-9651059cfcc8","Type":"ContainerStarted","Data":"3b7705f4ae1176b8112b3d834da4e75dc23df28e042b186bc4ad3ce694f924bc"} Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.264229 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" event={"ID":"0c2b0104-f94a-4e8a-bcd0-464ac8942f54","Type":"ContainerStarted","Data":"ccbc53894aa24e783284f0d7ab1b9c9d1754a67bc0a2d27562e1176f9ff44ec5"} Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.264360 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.266838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" event={"ID":"7e243b80-5980-459f-ba42-90ebdd42e05b","Type":"ContainerStarted","Data":"2ada8d487ae85ebd7b7b2ef943928335120d4525c6a9e68187ac308f2311dd9e"} Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.269038 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" event={"ID":"5810e280-be69-4236-9014-d459c65bd287","Type":"ContainerStarted","Data":"f144841bda2b4efd2161b2bef2466ce208dfd94b317482278691e11047b9b552"} Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.269309 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.278447 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj" podStartSLOduration=2.147494299 podStartE2EDuration="11.278437931s" podCreationTimestamp="2026-02-27 06:24:17 +0000 UTC" firstStartedPulling="2026-02-27 06:24:18.420143657 +0000 UTC m=+836.882764226" lastFinishedPulling="2026-02-27 06:24:27.551087289 +0000 UTC m=+846.013707858" observedRunningTime="2026-02-27 06:24:28.277514665 +0000 UTC m=+846.740135254" watchObservedRunningTime="2026-02-27 06:24:28.278437931 +0000 UTC m=+846.741058500" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 
06:24:28.289877 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.311088 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-vpgt6" podStartSLOduration=2.143533503 podStartE2EDuration="11.31106557s" podCreationTimestamp="2026-02-27 06:24:17 +0000 UTC" firstStartedPulling="2026-02-27 06:24:18.349083021 +0000 UTC m=+836.811703590" lastFinishedPulling="2026-02-27 06:24:27.516615088 +0000 UTC m=+845.979235657" observedRunningTime="2026-02-27 06:24:28.2972179 +0000 UTC m=+846.759838509" watchObservedRunningTime="2026-02-27 06:24:28.31106557 +0000 UTC m=+846.773686169" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.361070 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74" podStartSLOduration=2.287079488 podStartE2EDuration="11.361051387s" podCreationTimestamp="2026-02-27 06:24:17 +0000 UTC" firstStartedPulling="2026-02-27 06:24:18.442132135 +0000 UTC m=+836.904752704" lastFinishedPulling="2026-02-27 06:24:27.516104034 +0000 UTC m=+845.978724603" observedRunningTime="2026-02-27 06:24:28.33630962 +0000 UTC m=+846.798930229" watchObservedRunningTime="2026-02-27 06:24:28.361051387 +0000 UTC m=+846.823671956" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.396090 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-x6b52" podStartSLOduration=2.197931263 podStartE2EDuration="11.396072723s" podCreationTimestamp="2026-02-27 06:24:17 +0000 UTC" firstStartedPulling="2026-02-27 06:24:18.348967237 +0000 UTC m=+836.811587806" lastFinishedPulling="2026-02-27 06:24:27.547108707 +0000 UTC m=+846.009729266" observedRunningTime="2026-02-27 06:24:28.392701048 +0000 UTC 
m=+846.855321617" watchObservedRunningTime="2026-02-27 06:24:28.396072723 +0000 UTC m=+846.858693302" Feb 27 06:24:28 crc kubenswrapper[4725]: I0227 06:24:28.396787 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" podStartSLOduration=2.383338785 podStartE2EDuration="11.396782453s" podCreationTimestamp="2026-02-27 06:24:17 +0000 UTC" firstStartedPulling="2026-02-27 06:24:18.502595014 +0000 UTC m=+836.965215583" lastFinishedPulling="2026-02-27 06:24:27.516038692 +0000 UTC m=+845.978659251" observedRunningTime="2026-02-27 06:24:28.366208112 +0000 UTC m=+846.828828701" watchObservedRunningTime="2026-02-27 06:24:28.396782453 +0000 UTC m=+846.859403022" Feb 27 06:24:38 crc kubenswrapper[4725]: I0227 06:24:38.291080 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-9h7hk" Feb 27 06:24:39 crc kubenswrapper[4725]: I0227 06:24:39.610270 4725 scope.go:117] "RemoveContainer" containerID="ddbf53cf5ff95e919b248f533f46b56c2c853b0f5bb658de3f2bda113960db36" Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.750806 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd"] Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.752429 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.754997 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.763731 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd"] Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.904682 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.904744 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:55 crc kubenswrapper[4725]: I0227 06:24:55.904774 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bftbm\" (UniqueName: \"kubernetes.io/projected/b14dab93-a732-4bc5-8ecc-cc49b374669f-kube-api-access-bftbm\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: 
I0227 06:24:56.007085 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.007527 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.007634 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bftbm\" (UniqueName: \"kubernetes.io/projected/b14dab93-a732-4bc5-8ecc-cc49b374669f-kube-api-access-bftbm\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.008493 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.008544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.043862 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bftbm\" (UniqueName: \"kubernetes.io/projected/b14dab93-a732-4bc5-8ecc-cc49b374669f-kube-api-access-bftbm\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.114454 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.420426 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd"] Feb 27 06:24:56 crc kubenswrapper[4725]: I0227 06:24:56.455818 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" event={"ID":"b14dab93-a732-4bc5-8ecc-cc49b374669f","Type":"ContainerStarted","Data":"8126b06236097dba522536dc4d55caa399737339fc8e8bc1a1ef58fe759a6fc8"} Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.464832 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qcfsz"] Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.467443 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.477106 4725 generic.go:334] "Generic (PLEG): container finished" podID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerID="73af406e6bddcc7244daffa4b21b1d024c3627c782b01824360788ed0e2082cb" exitCode=0 Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.477179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" event={"ID":"b14dab93-a732-4bc5-8ecc-cc49b374669f","Type":"ContainerDied","Data":"73af406e6bddcc7244daffa4b21b1d024c3627c782b01824360788ed0e2082cb"} Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.499582 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcfsz"] Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.632924 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-utilities\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.633446 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwct5\" (UniqueName: \"kubernetes.io/projected/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-kube-api-access-rwct5\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.633576 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-catalog-content\") pod \"redhat-operators-qcfsz\" (UID: 
\"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.735137 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-utilities\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.735235 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwct5\" (UniqueName: \"kubernetes.io/projected/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-kube-api-access-rwct5\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.735257 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-catalog-content\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.735703 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-catalog-content\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.735906 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-utilities\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " 
pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.767858 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwct5\" (UniqueName: \"kubernetes.io/projected/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-kube-api-access-rwct5\") pod \"redhat-operators-qcfsz\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") " pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:57 crc kubenswrapper[4725]: I0227 06:24:57.828570 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:24:58 crc kubenswrapper[4725]: I0227 06:24:58.118621 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcfsz"] Feb 27 06:24:58 crc kubenswrapper[4725]: W0227 06:24:58.120463 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde0504e9_97ee_4ce8_b5a4_a5d7c22199b1.slice/crio-7a238a22e8289454e690314e0064cd02fbd9e50788a129f3c38fb31a1442e5a2 WatchSource:0}: Error finding container 7a238a22e8289454e690314e0064cd02fbd9e50788a129f3c38fb31a1442e5a2: Status 404 returned error can't find the container with id 7a238a22e8289454e690314e0064cd02fbd9e50788a129f3c38fb31a1442e5a2 Feb 27 06:24:58 crc kubenswrapper[4725]: I0227 06:24:58.485239 4725 generic.go:334] "Generic (PLEG): container finished" podID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerID="01707ca5e45644c627654219557cc6088fe49d8cab878bc2f10448ae67244fad" exitCode=0 Feb 27 06:24:58 crc kubenswrapper[4725]: I0227 06:24:58.485326 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerDied","Data":"01707ca5e45644c627654219557cc6088fe49d8cab878bc2f10448ae67244fad"} Feb 27 06:24:58 crc kubenswrapper[4725]: I0227 06:24:58.485361 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerStarted","Data":"7a238a22e8289454e690314e0064cd02fbd9e50788a129f3c38fb31a1442e5a2"} Feb 27 06:24:59 crc kubenswrapper[4725]: I0227 06:24:59.494431 4725 generic.go:334] "Generic (PLEG): container finished" podID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerID="b83fbf83e041d763c27b5f816099f5c8d880ffdf5aa41c0eb5f8d6e52740fb0c" exitCode=0 Feb 27 06:24:59 crc kubenswrapper[4725]: I0227 06:24:59.494533 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" event={"ID":"b14dab93-a732-4bc5-8ecc-cc49b374669f","Type":"ContainerDied","Data":"b83fbf83e041d763c27b5f816099f5c8d880ffdf5aa41c0eb5f8d6e52740fb0c"} Feb 27 06:24:59 crc kubenswrapper[4725]: I0227 06:24:59.497082 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerStarted","Data":"24f24019a113f9965f2b3cf71b44a4fb2aefff1e25abd001ef0d50e0359d27ca"} Feb 27 06:25:00 crc kubenswrapper[4725]: I0227 06:25:00.509514 4725 generic.go:334] "Generic (PLEG): container finished" podID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerID="24f24019a113f9965f2b3cf71b44a4fb2aefff1e25abd001ef0d50e0359d27ca" exitCode=0 Feb 27 06:25:00 crc kubenswrapper[4725]: I0227 06:25:00.509578 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerDied","Data":"24f24019a113f9965f2b3cf71b44a4fb2aefff1e25abd001ef0d50e0359d27ca"} Feb 27 06:25:00 crc kubenswrapper[4725]: I0227 06:25:00.513794 4725 generic.go:334] "Generic (PLEG): container finished" podID="b14dab93-a732-4bc5-8ecc-cc49b374669f" 
containerID="8aa1d0991a774c57885ae20e4117a1e10ffb8484ef8e422d69ce8a79569352e7" exitCode=0 Feb 27 06:25:00 crc kubenswrapper[4725]: I0227 06:25:00.513876 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" event={"ID":"b14dab93-a732-4bc5-8ecc-cc49b374669f","Type":"ContainerDied","Data":"8aa1d0991a774c57885ae20e4117a1e10ffb8484ef8e422d69ce8a79569352e7"} Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.522811 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerStarted","Data":"ef08b98531a14e60b8ce7e009111c03f8473246c04a0e4ddbc9fd3b3e383d272"} Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.547703 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qcfsz" podStartSLOduration=2.103728165 podStartE2EDuration="4.547683857s" podCreationTimestamp="2026-02-27 06:24:57 +0000 UTC" firstStartedPulling="2026-02-27 06:24:58.487232944 +0000 UTC m=+876.949853513" lastFinishedPulling="2026-02-27 06:25:00.931188606 +0000 UTC m=+879.393809205" observedRunningTime="2026-02-27 06:25:01.545168166 +0000 UTC m=+880.007788765" watchObservedRunningTime="2026-02-27 06:25:01.547683857 +0000 UTC m=+880.010304426" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.787116 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.894632 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-bundle\") pod \"b14dab93-a732-4bc5-8ecc-cc49b374669f\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.894746 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bftbm\" (UniqueName: \"kubernetes.io/projected/b14dab93-a732-4bc5-8ecc-cc49b374669f-kube-api-access-bftbm\") pod \"b14dab93-a732-4bc5-8ecc-cc49b374669f\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.894953 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-util\") pod \"b14dab93-a732-4bc5-8ecc-cc49b374669f\" (UID: \"b14dab93-a732-4bc5-8ecc-cc49b374669f\") " Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.895631 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-bundle" (OuterVolumeSpecName: "bundle") pod "b14dab93-a732-4bc5-8ecc-cc49b374669f" (UID: "b14dab93-a732-4bc5-8ecc-cc49b374669f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.896858 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.908336 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14dab93-a732-4bc5-8ecc-cc49b374669f-kube-api-access-bftbm" (OuterVolumeSpecName: "kube-api-access-bftbm") pod "b14dab93-a732-4bc5-8ecc-cc49b374669f" (UID: "b14dab93-a732-4bc5-8ecc-cc49b374669f"). InnerVolumeSpecName "kube-api-access-bftbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.927421 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-util" (OuterVolumeSpecName: "util") pod "b14dab93-a732-4bc5-8ecc-cc49b374669f" (UID: "b14dab93-a732-4bc5-8ecc-cc49b374669f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.998543 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b14dab93-a732-4bc5-8ecc-cc49b374669f-util\") on node \"crc\" DevicePath \"\"" Feb 27 06:25:01 crc kubenswrapper[4725]: I0227 06:25:01.998573 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bftbm\" (UniqueName: \"kubernetes.io/projected/b14dab93-a732-4bc5-8ecc-cc49b374669f-kube-api-access-bftbm\") on node \"crc\" DevicePath \"\"" Feb 27 06:25:02 crc kubenswrapper[4725]: I0227 06:25:02.534528 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" Feb 27 06:25:02 crc kubenswrapper[4725]: I0227 06:25:02.534668 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd" event={"ID":"b14dab93-a732-4bc5-8ecc-cc49b374669f","Type":"ContainerDied","Data":"8126b06236097dba522536dc4d55caa399737339fc8e8bc1a1ef58fe759a6fc8"} Feb 27 06:25:02 crc kubenswrapper[4725]: I0227 06:25:02.535008 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8126b06236097dba522536dc4d55caa399737339fc8e8bc1a1ef58fe759a6fc8" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.170161 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q"] Feb 27 06:25:06 crc kubenswrapper[4725]: E0227 06:25:06.170973 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="extract" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.170989 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="extract" Feb 27 06:25:06 crc kubenswrapper[4725]: E0227 06:25:06.171000 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="util" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.171006 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="util" Feb 27 06:25:06 crc kubenswrapper[4725]: E0227 06:25:06.171016 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="pull" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.171026 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="pull" 
Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.171166 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14dab93-a732-4bc5-8ecc-cc49b374669f" containerName="extract" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.171728 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.174447 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.175451 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.176914 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bnl86" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.179171 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q"] Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.263831 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9g7\" (UniqueName: \"kubernetes.io/projected/c0b93a17-8e40-4f49-94c7-cf241342c7be-kube-api-access-2s9g7\") pod \"nmstate-operator-75c5dccd6c-hgq5q\" (UID: \"c0b93a17-8e40-4f49-94c7-cf241342c7be\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.366213 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9g7\" (UniqueName: \"kubernetes.io/projected/c0b93a17-8e40-4f49-94c7-cf241342c7be-kube-api-access-2s9g7\") pod \"nmstate-operator-75c5dccd6c-hgq5q\" (UID: \"c0b93a17-8e40-4f49-94c7-cf241342c7be\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" Feb 27 06:25:06 crc 
kubenswrapper[4725]: I0227 06:25:06.396193 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9g7\" (UniqueName: \"kubernetes.io/projected/c0b93a17-8e40-4f49-94c7-cf241342c7be-kube-api-access-2s9g7\") pod \"nmstate-operator-75c5dccd6c-hgq5q\" (UID: \"c0b93a17-8e40-4f49-94c7-cf241342c7be\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.545119 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.806515 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q"] Feb 27 06:25:06 crc kubenswrapper[4725]: I0227 06:25:06.821606 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:25:07 crc kubenswrapper[4725]: I0227 06:25:07.566941 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" event={"ID":"c0b93a17-8e40-4f49-94c7-cf241342c7be","Type":"ContainerStarted","Data":"c1a7360cd9cd343f6767e334ee958ba93123184d76c316009717f73d16556255"} Feb 27 06:25:07 crc kubenswrapper[4725]: I0227 06:25:07.828905 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:25:07 crc kubenswrapper[4725]: I0227 06:25:07.829486 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:25:08 crc kubenswrapper[4725]: I0227 06:25:08.912364 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qcfsz" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="registry-server" probeResult="failure" output=< Feb 27 06:25:08 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" 
within 1s Feb 27 06:25:08 crc kubenswrapper[4725]: > Feb 27 06:25:10 crc kubenswrapper[4725]: I0227 06:25:10.586042 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" event={"ID":"c0b93a17-8e40-4f49-94c7-cf241342c7be","Type":"ContainerStarted","Data":"6d1f13a9acd976b3ab7ec567bef467d89000119a7f201923be27848914ecbda3"} Feb 27 06:25:10 crc kubenswrapper[4725]: I0227 06:25:10.611708 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hgq5q" podStartSLOduration=1.998822833 podStartE2EDuration="4.611688821s" podCreationTimestamp="2026-02-27 06:25:06 +0000 UTC" firstStartedPulling="2026-02-27 06:25:06.821342494 +0000 UTC m=+885.283963063" lastFinishedPulling="2026-02-27 06:25:09.434208482 +0000 UTC m=+887.896829051" observedRunningTime="2026-02-27 06:25:10.606790193 +0000 UTC m=+889.069410772" watchObservedRunningTime="2026-02-27 06:25:10.611688821 +0000 UTC m=+889.074309400" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.299646 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-bdf7n"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.305520 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.309569 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pp55s" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.317151 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-bdf7n"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.330585 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdcqh\" (UniqueName: \"kubernetes.io/projected/73608f57-a852-439f-82b8-364a37b0e88c-kube-api-access-mdcqh\") pod \"nmstate-metrics-69594cc75-bdf7n\" (UID: \"73608f57-a852-439f-82b8-364a37b0e88c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.345871 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-z2kll"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.346852 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.359441 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-6sx27"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.372858 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.377061 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.394593 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-6sx27"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.432941 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-ovs-socket\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.433024 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7b2eb7dd-736f-4c16-8630-ed2a8607e094-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.433102 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-dbus-socket\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.433304 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvz78\" (UniqueName: \"kubernetes.io/projected/3f5fe9d7-0289-42a1-a991-3d0285038f72-kube-api-access-qvz78\") pod \"nmstate-handler-z2kll\" (UID: 
\"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.433381 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdcqh\" (UniqueName: \"kubernetes.io/projected/73608f57-a852-439f-82b8-364a37b0e88c-kube-api-access-mdcqh\") pod \"nmstate-metrics-69594cc75-bdf7n\" (UID: \"73608f57-a852-439f-82b8-364a37b0e88c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.433433 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-nmstate-lock\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.433487 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jfs\" (UniqueName: \"kubernetes.io/projected/7b2eb7dd-736f-4c16-8630-ed2a8607e094-kube-api-access-77jfs\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.473207 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.474203 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.478117 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdcqh\" (UniqueName: \"kubernetes.io/projected/73608f57-a852-439f-82b8-364a37b0e88c-kube-api-access-mdcqh\") pod \"nmstate-metrics-69594cc75-bdf7n\" (UID: \"73608f57-a852-439f-82b8-364a37b0e88c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.479697 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.480203 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.489973 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7scrd" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.497606 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.534666 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9stz5\" (UniqueName: \"kubernetes.io/projected/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-kube-api-access-9stz5\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.534721 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7b2eb7dd-736f-4c16-8630-ed2a8607e094-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.534755 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.534783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-dbus-socket\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: E0227 06:25:17.535042 4725 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535215 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-dbus-socket\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535225 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvz78\" (UniqueName: \"kubernetes.io/projected/3f5fe9d7-0289-42a1-a991-3d0285038f72-kube-api-access-qvz78\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535392 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" 
(UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-nmstate-lock\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535455 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77jfs\" (UniqueName: \"kubernetes.io/projected/7b2eb7dd-736f-4c16-8630-ed2a8607e094-kube-api-access-77jfs\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535523 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535609 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-ovs-socket\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535752 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-ovs-socket\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: E0227 06:25:17.535776 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7b2eb7dd-736f-4c16-8630-ed2a8607e094-tls-key-pair podName:7b2eb7dd-736f-4c16-8630-ed2a8607e094 nodeName:}" failed. No retries permitted until 2026-02-27 06:25:18.035744007 +0000 UTC m=+896.498364576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7b2eb7dd-736f-4c16-8630-ed2a8607e094-tls-key-pair") pod "nmstate-webhook-786f45cff4-6sx27" (UID: "7b2eb7dd-736f-4c16-8630-ed2a8607e094") : secret "openshift-nmstate-webhook" not found Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.535799 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3f5fe9d7-0289-42a1-a991-3d0285038f72-nmstate-lock\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.564321 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvz78\" (UniqueName: \"kubernetes.io/projected/3f5fe9d7-0289-42a1-a991-3d0285038f72-kube-api-access-qvz78\") pod \"nmstate-handler-z2kll\" (UID: \"3f5fe9d7-0289-42a1-a991-3d0285038f72\") " pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.564421 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jfs\" (UniqueName: \"kubernetes.io/projected/7b2eb7dd-736f-4c16-8630-ed2a8607e094-kube-api-access-77jfs\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.626866 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.637322 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.637382 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9stz5\" (UniqueName: \"kubernetes.io/projected/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-kube-api-access-9stz5\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.637431 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.639164 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.642339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.661263 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9stz5\" (UniqueName: \"kubernetes.io/projected/450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea-kube-api-access-9stz5\") pod \"nmstate-console-plugin-5dcbbd79cf-46jf6\" (UID: \"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.695625 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z2kll" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.701519 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cc5dbf94f-ctqtm"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.702633 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.722378 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cc5dbf94f-ctqtm"] Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738107 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-oauth-config\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-service-ca\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738189 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-trusted-ca-bundle\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738241 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-oauth-serving-cert\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738260 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbklg\" (UniqueName: \"kubernetes.io/projected/59a5e5f3-13c4-414f-8016-9d222e81f36d-kube-api-access-hbklg\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738300 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-serving-cert\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.738319 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-config\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.799380 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840482 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-oauth-config\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840584 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-service-ca\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840614 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-trusted-ca-bundle\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840691 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-oauth-serving-cert\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840724 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbklg\" (UniqueName: \"kubernetes.io/projected/59a5e5f3-13c4-414f-8016-9d222e81f36d-kube-api-access-hbklg\") pod \"console-6cc5dbf94f-ctqtm\" (UID: 
\"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840766 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-serving-cert\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.840801 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-config\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.841579 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-service-ca\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.842256 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-config\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.842868 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-oauth-serving-cert\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " 
pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.842963 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a5e5f3-13c4-414f-8016-9d222e81f36d-trusted-ca-bundle\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.845832 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-oauth-config\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.853250 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a5e5f3-13c4-414f-8016-9d222e81f36d-console-serving-cert\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.857050 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbklg\" (UniqueName: \"kubernetes.io/projected/59a5e5f3-13c4-414f-8016-9d222e81f36d-kube-api-access-hbklg\") pod \"console-6cc5dbf94f-ctqtm\" (UID: \"59a5e5f3-13c4-414f-8016-9d222e81f36d\") " pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.891662 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:25:17 crc kubenswrapper[4725]: I0227 06:25:17.905614 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-bdf7n"] Feb 27 06:25:17 crc 
kubenswrapper[4725]: I0227 06:25:17.941670 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qcfsz" Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.003246 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6"] Feb 27 06:25:18 crc kubenswrapper[4725]: W0227 06:25:18.007646 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod450f9ae0_dd5c_4f1a_ad98_651d6cfe09ea.slice/crio-3b2e70ee799a0645f01e7373c5c920973a49c21d5ccef0e4b14a57a8dd0058a0 WatchSource:0}: Error finding container 3b2e70ee799a0645f01e7373c5c920973a49c21d5ccef0e4b14a57a8dd0058a0: Status 404 returned error can't find the container with id 3b2e70ee799a0645f01e7373c5c920973a49c21d5ccef0e4b14a57a8dd0058a0 Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.031062 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cc5dbf94f-ctqtm" Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.045342 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7b2eb7dd-736f-4c16-8630-ed2a8607e094-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.050189 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7b2eb7dd-736f-4c16-8630-ed2a8607e094-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-6sx27\" (UID: \"7b2eb7dd-736f-4c16-8630-ed2a8607e094\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.133762 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qcfsz"] Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.249269 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cc5dbf94f-ctqtm"] Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.297838 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.653805 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cc5dbf94f-ctqtm" event={"ID":"59a5e5f3-13c4-414f-8016-9d222e81f36d","Type":"ContainerStarted","Data":"7d16b3edf61f1f7066d0e9ad2c02aea6ecc6927c0a60c112370685b7e20b33ef"} Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.654259 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cc5dbf94f-ctqtm" event={"ID":"59a5e5f3-13c4-414f-8016-9d222e81f36d","Type":"ContainerStarted","Data":"a58c37d1ef9708b1637c5298dfaed163c0991b3a3b42c97058cad9be12d9d7ea"} Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.654955 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" event={"ID":"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea","Type":"ContainerStarted","Data":"3b2e70ee799a0645f01e7373c5c920973a49c21d5ccef0e4b14a57a8dd0058a0"} Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.656275 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z2kll" event={"ID":"3f5fe9d7-0289-42a1-a991-3d0285038f72","Type":"ContainerStarted","Data":"a4c3e63fe592e274f4894c0243341d63d28ffa66e66808955e60d03b472987ce"} Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.657309 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" event={"ID":"73608f57-a852-439f-82b8-364a37b0e88c","Type":"ContainerStarted","Data":"306101bf6ae89cc4a3d2d6f07908acc691c2999dbad08246687263483a78db97"} Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.779374 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cc5dbf94f-ctqtm" podStartSLOduration=1.7793145460000002 podStartE2EDuration="1.779314546s" podCreationTimestamp="2026-02-27 06:25:17 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:25:18.679475644 +0000 UTC m=+897.142096233" watchObservedRunningTime="2026-02-27 06:25:18.779314546 +0000 UTC m=+897.241935155"
Feb 27 06:25:18 crc kubenswrapper[4725]: I0227 06:25:18.788552 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-6sx27"]
Feb 27 06:25:18 crc kubenswrapper[4725]: W0227 06:25:18.814975 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b2eb7dd_736f_4c16_8630_ed2a8607e094.slice/crio-c08069f12b7cf303827076f9133310f5008830943979513cdb842ed1dfa2535a WatchSource:0}: Error finding container c08069f12b7cf303827076f9133310f5008830943979513cdb842ed1dfa2535a: Status 404 returned error can't find the container with id c08069f12b7cf303827076f9133310f5008830943979513cdb842ed1dfa2535a
Feb 27 06:25:19 crc kubenswrapper[4725]: I0227 06:25:19.673501 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" event={"ID":"7b2eb7dd-736f-4c16-8630-ed2a8607e094","Type":"ContainerStarted","Data":"c08069f12b7cf303827076f9133310f5008830943979513cdb842ed1dfa2535a"}
Feb 27 06:25:19 crc kubenswrapper[4725]: I0227 06:25:19.673608 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qcfsz" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="registry-server" containerID="cri-o://ef08b98531a14e60b8ce7e009111c03f8473246c04a0e4ddbc9fd3b3e383d272" gracePeriod=2
Feb 27 06:25:20 crc kubenswrapper[4725]: I0227 06:25:20.685912 4725 generic.go:334] "Generic (PLEG): container finished" podID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerID="ef08b98531a14e60b8ce7e009111c03f8473246c04a0e4ddbc9fd3b3e383d272" exitCode=0
Feb 27 06:25:20 crc kubenswrapper[4725]: I0227 06:25:20.685979 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerDied","Data":"ef08b98531a14e60b8ce7e009111c03f8473246c04a0e4ddbc9fd3b3e383d272"}
Feb 27 06:25:20 crc kubenswrapper[4725]: I0227 06:25:20.991395 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcfsz"
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.106370 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-utilities\") pod \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") "
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.106493 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-catalog-content\") pod \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") "
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.106633 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwct5\" (UniqueName: \"kubernetes.io/projected/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-kube-api-access-rwct5\") pod \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\" (UID: \"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1\") "
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.109588 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-utilities" (OuterVolumeSpecName: "utilities") pod "de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" (UID: "de0504e9-97ee-4ce8-b5a4-a5d7c22199b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.122839 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-kube-api-access-rwct5" (OuterVolumeSpecName: "kube-api-access-rwct5") pod "de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" (UID: "de0504e9-97ee-4ce8-b5a4-a5d7c22199b1"). InnerVolumeSpecName "kube-api-access-rwct5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.209136 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwct5\" (UniqueName: \"kubernetes.io/projected/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-kube-api-access-rwct5\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.209185 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.245700 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" (UID: "de0504e9-97ee-4ce8-b5a4-a5d7c22199b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.310485 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.698801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcfsz" event={"ID":"de0504e9-97ee-4ce8-b5a4-a5d7c22199b1","Type":"ContainerDied","Data":"7a238a22e8289454e690314e0064cd02fbd9e50788a129f3c38fb31a1442e5a2"}
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.698860 4725 scope.go:117] "RemoveContainer" containerID="ef08b98531a14e60b8ce7e009111c03f8473246c04a0e4ddbc9fd3b3e383d272"
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.698888 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcfsz"
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.733799 4725 scope.go:117] "RemoveContainer" containerID="24f24019a113f9965f2b3cf71b44a4fb2aefff1e25abd001ef0d50e0359d27ca"
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.762573 4725 scope.go:117] "RemoveContainer" containerID="01707ca5e45644c627654219557cc6088fe49d8cab878bc2f10448ae67244fad"
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.763358 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qcfsz"]
Feb 27 06:25:21 crc kubenswrapper[4725]: I0227 06:25:21.768993 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qcfsz"]
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.258455 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" path="/var/lib/kubelet/pods/de0504e9-97ee-4ce8-b5a4-a5d7c22199b1/volumes"
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.712760 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" event={"ID":"450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea","Type":"ContainerStarted","Data":"b2e89de401f3d6897e9cde7d52dc141063988c30d1d51eacad90a9f7138e3bde"}
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.716144 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" event={"ID":"7b2eb7dd-736f-4c16-8630-ed2a8607e094","Type":"ContainerStarted","Data":"4f316cdc8562e95ec911b1277aeebb9a41200eeeae415e4b5f92a7aae86e6165"}
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.716805 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27"
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.719248 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z2kll" event={"ID":"3f5fe9d7-0289-42a1-a991-3d0285038f72","Type":"ContainerStarted","Data":"46fe819009ab7a7adf6bfe4546bf4156f6af10d92944bf092d4d5d448578c2d8"}
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.719421 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-z2kll"
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.721980 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" event={"ID":"73608f57-a852-439f-82b8-364a37b0e88c","Type":"ContainerStarted","Data":"c1829b77c9feabdb2d8e30ee778a2d170c24406f2cad4262e306e8a7674040ab"}
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.739674 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-46jf6" podStartSLOduration=2.130879716 podStartE2EDuration="5.73965705s" podCreationTimestamp="2026-02-27 06:25:17 +0000 UTC" firstStartedPulling="2026-02-27 06:25:18.010146386 +0000 UTC m=+896.472766955" lastFinishedPulling="2026-02-27 06:25:21.61892372 +0000 UTC m=+900.081544289" observedRunningTime="2026-02-27 06:25:22.73932411 +0000 UTC m=+901.201944749" watchObservedRunningTime="2026-02-27 06:25:22.73965705 +0000 UTC m=+901.202277629"
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.776594 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27" podStartSLOduration=2.971366573 podStartE2EDuration="5.776577479s" podCreationTimestamp="2026-02-27 06:25:17 +0000 UTC" firstStartedPulling="2026-02-27 06:25:18.818369275 +0000 UTC m=+897.280989854" lastFinishedPulling="2026-02-27 06:25:21.623580151 +0000 UTC m=+900.086200760" observedRunningTime="2026-02-27 06:25:22.773959006 +0000 UTC m=+901.236579595" watchObservedRunningTime="2026-02-27 06:25:22.776577479 +0000 UTC m=+901.239198058"
Feb 27 06:25:22 crc kubenswrapper[4725]: I0227 06:25:22.798355 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-z2kll" podStartSLOduration=1.894784257 podStartE2EDuration="5.79825273s" podCreationTimestamp="2026-02-27 06:25:17 +0000 UTC" firstStartedPulling="2026-02-27 06:25:17.75211143 +0000 UTC m=+896.214731999" lastFinishedPulling="2026-02-27 06:25:21.655579903 +0000 UTC m=+900.118200472" observedRunningTime="2026-02-27 06:25:22.790393738 +0000 UTC m=+901.253014327" watchObservedRunningTime="2026-02-27 06:25:22.79825273 +0000 UTC m=+901.260873349"
Feb 27 06:25:25 crc kubenswrapper[4725]: I0227 06:25:25.748647 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" event={"ID":"73608f57-a852-439f-82b8-364a37b0e88c","Type":"ContainerStarted","Data":"18bbb326c202fd79a8247520252edc12a4fc75e9fb309df03afcd5aea98cb94f"}
Feb 27 06:25:25 crc kubenswrapper[4725]: I0227 06:25:25.776447 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-bdf7n" podStartSLOduration=2.080570139 podStartE2EDuration="8.776417795s" podCreationTimestamp="2026-02-27 06:25:17 +0000 UTC" firstStartedPulling="2026-02-27 06:25:17.919343069 +0000 UTC m=+896.381963638" lastFinishedPulling="2026-02-27 06:25:24.615190725 +0000 UTC m=+903.077811294" observedRunningTime="2026-02-27 06:25:25.7737528 +0000 UTC m=+904.236373449" watchObservedRunningTime="2026-02-27 06:25:25.776417795 +0000 UTC m=+904.239038444"
Feb 27 06:25:27 crc kubenswrapper[4725]: I0227 06:25:27.736621 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-z2kll"
Feb 27 06:25:28 crc kubenswrapper[4725]: I0227 06:25:28.031573 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6cc5dbf94f-ctqtm"
Feb 27 06:25:28 crc kubenswrapper[4725]: I0227 06:25:28.031901 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cc5dbf94f-ctqtm"
Feb 27 06:25:28 crc kubenswrapper[4725]: I0227 06:25:28.039346 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cc5dbf94f-ctqtm"
Feb 27 06:25:28 crc kubenswrapper[4725]: I0227 06:25:28.779995 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cc5dbf94f-ctqtm"
Feb 27 06:25:28 crc kubenswrapper[4725]: I0227 06:25:28.862383 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qr9hb"]
Feb 27 06:25:38 crc kubenswrapper[4725]: I0227 06:25:38.307409 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-6sx27"
Feb 27 06:25:53 crc kubenswrapper[4725]: I0227 06:25:53.943033 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-qr9hb" podUID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" containerName="console" containerID="cri-o://413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1" gracePeriod=15
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.455338 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qr9hb_b0dcb291-e867-45d5-91f3-fa9b18a090c5/console/0.log"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.455720 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qr9hb"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546616 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-trusted-ca-bundle\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546669 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-serving-cert\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546734 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-service-ca\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546760 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-config\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546823 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brjww\" (UniqueName: \"kubernetes.io/projected/b0dcb291-e867-45d5-91f3-fa9b18a090c5-kube-api-access-brjww\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546850 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-oauth-serving-cert\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.546881 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-oauth-config\") pod \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\" (UID: \"b0dcb291-e867-45d5-91f3-fa9b18a090c5\") "
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.548584 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-service-ca" (OuterVolumeSpecName: "service-ca") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.548596 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.549031 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.549237 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-config" (OuterVolumeSpecName: "console-config") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.569614 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0dcb291-e867-45d5-91f3-fa9b18a090c5-kube-api-access-brjww" (OuterVolumeSpecName: "kube-api-access-brjww") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "kube-api-access-brjww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.570327 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.570900 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b0dcb291-e867-45d5-91f3-fa9b18a090c5" (UID: "b0dcb291-e867-45d5-91f3-fa9b18a090c5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648254 4725 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648336 4725 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648346 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brjww\" (UniqueName: \"kubernetes.io/projected/b0dcb291-e867-45d5-91f3-fa9b18a090c5-kube-api-access-brjww\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648368 4725 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648377 4725 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648386 4725 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dcb291-e867-45d5-91f3-fa9b18a090c5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.648396 4725 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dcb291-e867-45d5-91f3-fa9b18a090c5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.834780 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"]
Feb 27 06:25:54 crc kubenswrapper[4725]: E0227 06:25:54.834984 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="extract-content"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.834995 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="extract-content"
Feb 27 06:25:54 crc kubenswrapper[4725]: E0227 06:25:54.835006 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="extract-utilities"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.835012 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="extract-utilities"
Feb 27 06:25:54 crc kubenswrapper[4725]: E0227 06:25:54.835022 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" containerName="console"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.835029 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" containerName="console"
Feb 27 06:25:54 crc kubenswrapper[4725]: E0227 06:25:54.835041 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="registry-server"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.835046 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="registry-server"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.835154 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="de0504e9-97ee-4ce8-b5a4-a5d7c22199b1" containerName="registry-server"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.835176 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" containerName="console"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.835906 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.837839 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.850873 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvj7\" (UniqueName: \"kubernetes.io/projected/7f14764c-87e1-4b38-9343-86b368d36b24-kube-api-access-4zvj7\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.851135 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.851167 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.886492 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"]
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.952890 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvj7\" (UniqueName: \"kubernetes.io/projected/7f14764c-87e1-4b38-9343-86b368d36b24-kube-api-access-4zvj7\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.952965 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.953007 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.953831 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.953968 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:54 crc kubenswrapper[4725]: I0227 06:25:54.976887 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvj7\" (UniqueName: \"kubernetes.io/projected/7f14764c-87e1-4b38-9343-86b368d36b24-kube-api-access-4zvj7\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.004112 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qr9hb_b0dcb291-e867-45d5-91f3-fa9b18a090c5/console/0.log"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.004178 4725 generic.go:334] "Generic (PLEG): container finished" podID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" containerID="413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1" exitCode=2
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.004223 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qr9hb" event={"ID":"b0dcb291-e867-45d5-91f3-fa9b18a090c5","Type":"ContainerDied","Data":"413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1"}
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.004261 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qr9hb" event={"ID":"b0dcb291-e867-45d5-91f3-fa9b18a090c5","Type":"ContainerDied","Data":"ff74a9c5f338fdc90d580f22fa8610a9dfe2f974661a3cf4fa9cdc50f68abcbe"}
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.004330 4725 scope.go:117] "RemoveContainer" containerID="413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.004494 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qr9hb"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.029513 4725 scope.go:117] "RemoveContainer" containerID="413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1"
Feb 27 06:25:55 crc kubenswrapper[4725]: E0227 06:25:55.030193 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1\": container with ID starting with 413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1 not found: ID does not exist" containerID="413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.030245 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1"} err="failed to get container status \"413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1\": rpc error: code = NotFound desc = could not find container \"413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1\": container with ID starting with 413d01ac1e085292a5fff233513c548171dd8e32953a5c26d15a82cc760868d1 not found: ID does not exist"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.047509 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qr9hb"]
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.051345 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-qr9hb"]
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.156932 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:25:55 crc kubenswrapper[4725]: I0227 06:25:55.402430 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"]
Feb 27 06:25:56 crc kubenswrapper[4725]: I0227 06:25:56.020785 4725 generic.go:334] "Generic (PLEG): container finished" podID="7f14764c-87e1-4b38-9343-86b368d36b24" containerID="128eaa86a24f28f01859b03e7042af0ef901c28d707718fbb86ab256a00eceee" exitCode=0
Feb 27 06:25:56 crc kubenswrapper[4725]: I0227 06:25:56.020862 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm" event={"ID":"7f14764c-87e1-4b38-9343-86b368d36b24","Type":"ContainerDied","Data":"128eaa86a24f28f01859b03e7042af0ef901c28d707718fbb86ab256a00eceee"}
Feb 27 06:25:56 crc kubenswrapper[4725]: I0227 06:25:56.021161 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm" event={"ID":"7f14764c-87e1-4b38-9343-86b368d36b24","Type":"ContainerStarted","Data":"87d52f5105cc1acbf122490981153022641d93686d136f40bbae8164a2f8a693"}
Feb 27 06:25:56 crc kubenswrapper[4725]: I0227 06:25:56.266521 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0dcb291-e867-45d5-91f3-fa9b18a090c5" path="/var/lib/kubelet/pods/b0dcb291-e867-45d5-91f3-fa9b18a090c5/volumes"
Feb 27 06:25:58 crc kubenswrapper[4725]: I0227 06:25:58.039469 4725 generic.go:334] "Generic (PLEG): container finished" podID="7f14764c-87e1-4b38-9343-86b368d36b24" containerID="0f284ece66fd6505a1343c7cf561cb6162aa4b3c2e08234e7c89ec97b4d9fdcc" exitCode=0
Feb 27 06:25:58 crc kubenswrapper[4725]: I0227 06:25:58.039544 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm" event={"ID":"7f14764c-87e1-4b38-9343-86b368d36b24","Type":"ContainerDied","Data":"0f284ece66fd6505a1343c7cf561cb6162aa4b3c2e08234e7c89ec97b4d9fdcc"}
Feb 27 06:25:59 crc kubenswrapper[4725]: I0227 06:25:59.052567 4725 generic.go:334] "Generic (PLEG): container finished" podID="7f14764c-87e1-4b38-9343-86b368d36b24" containerID="24e0f66d56da94b524352194829d7fdfdedaf3a811cca89746aa0f4f1635ba21" exitCode=0
Feb 27 06:25:59 crc kubenswrapper[4725]: I0227 06:25:59.052643 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm" event={"ID":"7f14764c-87e1-4b38-9343-86b368d36b24","Type":"ContainerDied","Data":"24e0f66d56da94b524352194829d7fdfdedaf3a811cca89746aa0f4f1635ba21"}
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.144506 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536226-hf8lc"]
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.145698 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536226-hf8lc"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.154962 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.155123 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.155607 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.161959 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536226-hf8lc"]
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.231898 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfhs\" (UniqueName: \"kubernetes.io/projected/e4efe9d9-892b-4097-80f3-4d95d83f52fe-kube-api-access-7qfhs\") pod \"auto-csr-approver-29536226-hf8lc\" (UID: \"e4efe9d9-892b-4097-80f3-4d95d83f52fe\") " pod="openshift-infra/auto-csr-approver-29536226-hf8lc"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.334772 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfhs\" (UniqueName: \"kubernetes.io/projected/e4efe9d9-892b-4097-80f3-4d95d83f52fe-kube-api-access-7qfhs\") pod \"auto-csr-approver-29536226-hf8lc\" (UID: \"e4efe9d9-892b-4097-80f3-4d95d83f52fe\") " pod="openshift-infra/auto-csr-approver-29536226-hf8lc"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.360965 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfhs\" (UniqueName: \"kubernetes.io/projected/e4efe9d9-892b-4097-80f3-4d95d83f52fe-kube-api-access-7qfhs\") pod \"auto-csr-approver-29536226-hf8lc\" (UID: \"e4efe9d9-892b-4097-80f3-4d95d83f52fe\") " pod="openshift-infra/auto-csr-approver-29536226-hf8lc"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.389860 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm"
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.435935 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-bundle\") pod \"7f14764c-87e1-4b38-9343-86b368d36b24\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") "
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.436025 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-util\") pod \"7f14764c-87e1-4b38-9343-86b368d36b24\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") "
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.436163 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvj7\" (UniqueName: \"kubernetes.io/projected/7f14764c-87e1-4b38-9343-86b368d36b24-kube-api-access-4zvj7\") pod \"7f14764c-87e1-4b38-9343-86b368d36b24\" (UID: \"7f14764c-87e1-4b38-9343-86b368d36b24\") "
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.437595 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-bundle" (OuterVolumeSpecName: "bundle") pod "7f14764c-87e1-4b38-9343-86b368d36b24" (UID: "7f14764c-87e1-4b38-9343-86b368d36b24"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.440280 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f14764c-87e1-4b38-9343-86b368d36b24-kube-api-access-4zvj7" (OuterVolumeSpecName: "kube-api-access-4zvj7") pod "7f14764c-87e1-4b38-9343-86b368d36b24" (UID: "7f14764c-87e1-4b38-9343-86b368d36b24"). InnerVolumeSpecName "kube-api-access-4zvj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.457344 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-util" (OuterVolumeSpecName: "util") pod "7f14764c-87e1-4b38-9343-86b368d36b24" (UID: "7f14764c-87e1-4b38-9343-86b368d36b24"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.465550 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.539079 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvj7\" (UniqueName: \"kubernetes.io/projected/7f14764c-87e1-4b38-9343-86b368d36b24-kube-api-access-4zvj7\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.539136 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.539154 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f14764c-87e1-4b38-9343-86b368d36b24-util\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:00 crc kubenswrapper[4725]: I0227 06:26:00.689187 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536226-hf8lc"] Feb 27 06:26:00 crc kubenswrapper[4725]: W0227 06:26:00.697833 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4efe9d9_892b_4097_80f3_4d95d83f52fe.slice/crio-4722480ba23de29492b82c6d4dd5ab142496a63a6d6e4816a096c0275cdecc2e WatchSource:0}: Error finding container 4722480ba23de29492b82c6d4dd5ab142496a63a6d6e4816a096c0275cdecc2e: Status 404 returned error can't find the container with id 4722480ba23de29492b82c6d4dd5ab142496a63a6d6e4816a096c0275cdecc2e Feb 27 06:26:01 crc kubenswrapper[4725]: I0227 06:26:01.071525 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" event={"ID":"e4efe9d9-892b-4097-80f3-4d95d83f52fe","Type":"ContainerStarted","Data":"4722480ba23de29492b82c6d4dd5ab142496a63a6d6e4816a096c0275cdecc2e"} Feb 27 06:26:01 crc kubenswrapper[4725]: I0227 06:26:01.075583 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm" event={"ID":"7f14764c-87e1-4b38-9343-86b368d36b24","Type":"ContainerDied","Data":"87d52f5105cc1acbf122490981153022641d93686d136f40bbae8164a2f8a693"} Feb 27 06:26:01 crc kubenswrapper[4725]: I0227 06:26:01.075649 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87d52f5105cc1acbf122490981153022641d93686d136f40bbae8164a2f8a693" Feb 27 06:26:01 crc kubenswrapper[4725]: I0227 06:26:01.075700 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm" Feb 27 06:26:02 crc kubenswrapper[4725]: I0227 06:26:02.096263 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" event={"ID":"e4efe9d9-892b-4097-80f3-4d95d83f52fe","Type":"ContainerStarted","Data":"6e97d5c93ea25766734af3f35175af30c69bcdd8660a6bac808b212878cd4555"} Feb 27 06:26:02 crc kubenswrapper[4725]: I0227 06:26:02.554654 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:26:02 crc kubenswrapper[4725]: I0227 06:26:02.554736 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:26:03 crc kubenswrapper[4725]: I0227 06:26:03.103984 4725 generic.go:334] "Generic (PLEG): container finished" podID="e4efe9d9-892b-4097-80f3-4d95d83f52fe" 
containerID="6e97d5c93ea25766734af3f35175af30c69bcdd8660a6bac808b212878cd4555" exitCode=0 Feb 27 06:26:03 crc kubenswrapper[4725]: I0227 06:26:03.104202 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" event={"ID":"e4efe9d9-892b-4097-80f3-4d95d83f52fe","Type":"ContainerDied","Data":"6e97d5c93ea25766734af3f35175af30c69bcdd8660a6bac808b212878cd4555"} Feb 27 06:26:04 crc kubenswrapper[4725]: I0227 06:26:04.432277 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" Feb 27 06:26:04 crc kubenswrapper[4725]: I0227 06:26:04.492635 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qfhs\" (UniqueName: \"kubernetes.io/projected/e4efe9d9-892b-4097-80f3-4d95d83f52fe-kube-api-access-7qfhs\") pod \"e4efe9d9-892b-4097-80f3-4d95d83f52fe\" (UID: \"e4efe9d9-892b-4097-80f3-4d95d83f52fe\") " Feb 27 06:26:04 crc kubenswrapper[4725]: I0227 06:26:04.500416 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4efe9d9-892b-4097-80f3-4d95d83f52fe-kube-api-access-7qfhs" (OuterVolumeSpecName: "kube-api-access-7qfhs") pod "e4efe9d9-892b-4097-80f3-4d95d83f52fe" (UID: "e4efe9d9-892b-4097-80f3-4d95d83f52fe"). InnerVolumeSpecName "kube-api-access-7qfhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:26:04 crc kubenswrapper[4725]: I0227 06:26:04.594449 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qfhs\" (UniqueName: \"kubernetes.io/projected/e4efe9d9-892b-4097-80f3-4d95d83f52fe-kube-api-access-7qfhs\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:05 crc kubenswrapper[4725]: I0227 06:26:05.122801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" event={"ID":"e4efe9d9-892b-4097-80f3-4d95d83f52fe","Type":"ContainerDied","Data":"4722480ba23de29492b82c6d4dd5ab142496a63a6d6e4816a096c0275cdecc2e"} Feb 27 06:26:05 crc kubenswrapper[4725]: I0227 06:26:05.122853 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4722480ba23de29492b82c6d4dd5ab142496a63a6d6e4816a096c0275cdecc2e" Feb 27 06:26:05 crc kubenswrapper[4725]: I0227 06:26:05.122887 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536226-hf8lc" Feb 27 06:26:05 crc kubenswrapper[4725]: I0227 06:26:05.174457 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536220-9dzkm"] Feb 27 06:26:05 crc kubenswrapper[4725]: I0227 06:26:05.181141 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536220-9dzkm"] Feb 27 06:26:06 crc kubenswrapper[4725]: I0227 06:26:06.263720 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4606ec17-1d6b-4af7-b13b-10ed389a0987" path="/var/lib/kubelet/pods/4606ec17-1d6b-4af7-b13b-10ed389a0987/volumes" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.002785 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-679f87455-8frdc"] Feb 27 06:26:10 crc kubenswrapper[4725]: E0227 06:26:10.004212 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e4efe9d9-892b-4097-80f3-4d95d83f52fe" containerName="oc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.004357 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4efe9d9-892b-4097-80f3-4d95d83f52fe" containerName="oc" Feb 27 06:26:10 crc kubenswrapper[4725]: E0227 06:26:10.004446 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="extract" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.004521 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="extract" Feb 27 06:26:10 crc kubenswrapper[4725]: E0227 06:26:10.004602 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="pull" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.004677 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="pull" Feb 27 06:26:10 crc kubenswrapper[4725]: E0227 06:26:10.004760 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="util" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.004899 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="util" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.005094 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4efe9d9-892b-4097-80f3-4d95d83f52fe" containerName="oc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.005167 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f14764c-87e1-4b38-9343-86b368d36b24" containerName="extract" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.005689 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.009022 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.009084 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.023340 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-679f87455-8frdc"] Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.033470 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-k8qkn" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.034224 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.034387 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.095579 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/675e5722-7295-4f2f-acaa-7ad289facd96-webhook-cert\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.095666 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwhh\" (UniqueName: \"kubernetes.io/projected/675e5722-7295-4f2f-acaa-7ad289facd96-kube-api-access-cfwhh\") pod 
\"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.095693 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/675e5722-7295-4f2f-acaa-7ad289facd96-apiservice-cert\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.197005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/675e5722-7295-4f2f-acaa-7ad289facd96-webhook-cert\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.197134 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhh\" (UniqueName: \"kubernetes.io/projected/675e5722-7295-4f2f-acaa-7ad289facd96-kube-api-access-cfwhh\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.197166 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/675e5722-7295-4f2f-acaa-7ad289facd96-apiservice-cert\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc 
kubenswrapper[4725]: I0227 06:26:10.203745 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/675e5722-7295-4f2f-acaa-7ad289facd96-apiservice-cert\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.203982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/675e5722-7295-4f2f-acaa-7ad289facd96-webhook-cert\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.217422 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwhh\" (UniqueName: \"kubernetes.io/projected/675e5722-7295-4f2f-acaa-7ad289facd96-kube-api-access-cfwhh\") pod \"metallb-operator-controller-manager-679f87455-8frdc\" (UID: \"675e5722-7295-4f2f-acaa-7ad289facd96\") " pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.322965 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.384018 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9"] Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.385578 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.388241 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.388337 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-ht4sv" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.388599 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.401144 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9"] Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.502019 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78958b18-878f-4fce-b4b0-d799ed1225ce-apiservice-cert\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.502337 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78958b18-878f-4fce-b4b0-d799ed1225ce-webhook-cert\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.502362 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hc5\" (UniqueName: 
\"kubernetes.io/projected/78958b18-878f-4fce-b4b0-d799ed1225ce-kube-api-access-k8hc5\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.589196 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-679f87455-8frdc"] Feb 27 06:26:10 crc kubenswrapper[4725]: W0227 06:26:10.598164 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675e5722_7295_4f2f_acaa_7ad289facd96.slice/crio-b864174c4fa9650ad3ae384fb3b29ab195792c1de81bd5e0eac94762e0aa4614 WatchSource:0}: Error finding container b864174c4fa9650ad3ae384fb3b29ab195792c1de81bd5e0eac94762e0aa4614: Status 404 returned error can't find the container with id b864174c4fa9650ad3ae384fb3b29ab195792c1de81bd5e0eac94762e0aa4614 Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.603015 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78958b18-878f-4fce-b4b0-d799ed1225ce-webhook-cert\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.603051 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hc5\" (UniqueName: \"kubernetes.io/projected/78958b18-878f-4fce-b4b0-d799ed1225ce-kube-api-access-k8hc5\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.603129 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78958b18-878f-4fce-b4b0-d799ed1225ce-apiservice-cert\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.610212 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78958b18-878f-4fce-b4b0-d799ed1225ce-webhook-cert\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.610212 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78958b18-878f-4fce-b4b0-d799ed1225ce-apiservice-cert\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.624037 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hc5\" (UniqueName: \"kubernetes.io/projected/78958b18-878f-4fce-b4b0-d799ed1225ce-kube-api-access-k8hc5\") pod \"metallb-operator-webhook-server-59c54c548d-8fzq9\" (UID: \"78958b18-878f-4fce-b4b0-d799ed1225ce\") " pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.703110 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.802237 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.803259 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.820213 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.906599 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-catalog-content\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.906665 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgcv\" (UniqueName: \"kubernetes.io/projected/359b62ed-682f-43ae-9a58-1953f516c2d0-kube-api-access-wsgcv\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:10 crc kubenswrapper[4725]: I0227 06:26:10.906697 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-utilities\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.008414 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-catalog-content\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.008485 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgcv\" (UniqueName: \"kubernetes.io/projected/359b62ed-682f-43ae-9a58-1953f516c2d0-kube-api-access-wsgcv\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.008517 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-utilities\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.008989 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-utilities\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.009172 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-catalog-content\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.031141 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wsgcv\" (UniqueName: \"kubernetes.io/projected/359b62ed-682f-43ae-9a58-1953f516c2d0-kube-api-access-wsgcv\") pod \"community-operators-bps24\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.120312 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.160599 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" event={"ID":"675e5722-7295-4f2f-acaa-7ad289facd96","Type":"ContainerStarted","Data":"b864174c4fa9650ad3ae384fb3b29ab195792c1de81bd5e0eac94762e0aa4614"} Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.252394 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9"] Feb 27 06:26:11 crc kubenswrapper[4725]: I0227 06:26:11.601403 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 06:26:11 crc kubenswrapper[4725]: W0227 06:26:11.607906 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359b62ed_682f_43ae_9a58_1953f516c2d0.slice/crio-3e5fd64b72e0c5dff22a7ba159c84a6328207604ae3d2cce39b63b8cc470d26f WatchSource:0}: Error finding container 3e5fd64b72e0c5dff22a7ba159c84a6328207604ae3d2cce39b63b8cc470d26f: Status 404 returned error can't find the container with id 3e5fd64b72e0c5dff22a7ba159c84a6328207604ae3d2cce39b63b8cc470d26f Feb 27 06:26:12 crc kubenswrapper[4725]: I0227 06:26:12.172626 4725 generic.go:334] "Generic (PLEG): container finished" podID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerID="40c7d5b95381a289a13281b21d8a33761775065cccb524091f6aebfb652c531c" exitCode=0 Feb 27 06:26:12 crc kubenswrapper[4725]: 
I0227 06:26:12.172695 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bps24" event={"ID":"359b62ed-682f-43ae-9a58-1953f516c2d0","Type":"ContainerDied","Data":"40c7d5b95381a289a13281b21d8a33761775065cccb524091f6aebfb652c531c"} Feb 27 06:26:12 crc kubenswrapper[4725]: I0227 06:26:12.173072 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bps24" event={"ID":"359b62ed-682f-43ae-9a58-1953f516c2d0","Type":"ContainerStarted","Data":"3e5fd64b72e0c5dff22a7ba159c84a6328207604ae3d2cce39b63b8cc470d26f"} Feb 27 06:26:12 crc kubenswrapper[4725]: I0227 06:26:12.174750 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" event={"ID":"78958b18-878f-4fce-b4b0-d799ed1225ce","Type":"ContainerStarted","Data":"6ca7abc8606a55c56b5ad0c0ff06a8eebd5f60a3a1fc742d46c06c6b142c386e"} Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.253855 4725 generic.go:334] "Generic (PLEG): container finished" podID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerID="d87f2bb04c4ad4f08bdcd2f9539d1986a2e0d60caeeab2c7c7b1eeccae3e7681" exitCode=0 Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.268640 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" event={"ID":"675e5722-7295-4f2f-acaa-7ad289facd96","Type":"ContainerStarted","Data":"ca8002e58c16bd1c77a57e470d96deef36e2a1638340eb115fdac8ae6088fd98"} Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.268704 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bps24" event={"ID":"359b62ed-682f-43ae-9a58-1953f516c2d0","Type":"ContainerDied","Data":"d87f2bb04c4ad4f08bdcd2f9539d1986a2e0d60caeeab2c7c7b1eeccae3e7681"} Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.268737 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.268759 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.268777 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" event={"ID":"78958b18-878f-4fce-b4b0-d799ed1225ce","Type":"ContainerStarted","Data":"9bd077a44e588829ee116b4613045fbaa475d44a32794bef5137d186338e1a48"} Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.294856 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" podStartSLOduration=2.810277956 podStartE2EDuration="11.294842853s" podCreationTimestamp="2026-02-27 06:26:09 +0000 UTC" firstStartedPulling="2026-02-27 06:26:10.601116667 +0000 UTC m=+949.063737236" lastFinishedPulling="2026-02-27 06:26:19.085681554 +0000 UTC m=+957.548302133" observedRunningTime="2026-02-27 06:26:20.289965356 +0000 UTC m=+958.752585925" watchObservedRunningTime="2026-02-27 06:26:20.294842853 +0000 UTC m=+958.757463422" Feb 27 06:26:20 crc kubenswrapper[4725]: I0227 06:26:20.342852 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" podStartSLOduration=2.543838934 podStartE2EDuration="10.342826655s" podCreationTimestamp="2026-02-27 06:26:10 +0000 UTC" firstStartedPulling="2026-02-27 06:26:11.308387104 +0000 UTC m=+949.771007663" lastFinishedPulling="2026-02-27 06:26:19.107374785 +0000 UTC m=+957.569995384" observedRunningTime="2026-02-27 06:26:20.338686078 +0000 UTC m=+958.801306657" watchObservedRunningTime="2026-02-27 06:26:20.342826655 +0000 UTC m=+958.805447244" Feb 27 06:26:21 crc kubenswrapper[4725]: I0227 06:26:21.269893 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bps24" event={"ID":"359b62ed-682f-43ae-9a58-1953f516c2d0","Type":"ContainerStarted","Data":"bf5742f4fdc6915f01102e306e03fe341d4f4d29ea562f3c6ffb032aca94d50a"} Feb 27 06:26:21 crc kubenswrapper[4725]: I0227 06:26:21.294269 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bps24" podStartSLOduration=2.824932989 podStartE2EDuration="11.294247967s" podCreationTimestamp="2026-02-27 06:26:10 +0000 UTC" firstStartedPulling="2026-02-27 06:26:12.174824732 +0000 UTC m=+950.637445341" lastFinishedPulling="2026-02-27 06:26:20.64413972 +0000 UTC m=+959.106760319" observedRunningTime="2026-02-27 06:26:21.288947028 +0000 UTC m=+959.751567667" watchObservedRunningTime="2026-02-27 06:26:21.294247967 +0000 UTC m=+959.756868546" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.396864 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfj6c"] Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.400078 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.406915 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfj6c"] Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.548590 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-catalog-content\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.548663 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-utilities\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.548687 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbkg\" (UniqueName: \"kubernetes.io/projected/ea7d3666-0f1e-4788-8c39-9ac740fa0763-kube-api-access-tgbkg\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.650298 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-catalog-content\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.650380 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-utilities\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.650417 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbkg\" (UniqueName: \"kubernetes.io/projected/ea7d3666-0f1e-4788-8c39-9ac740fa0763-kube-api-access-tgbkg\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.651157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-catalog-content\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.651229 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-utilities\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.671689 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbkg\" (UniqueName: \"kubernetes.io/projected/ea7d3666-0f1e-4788-8c39-9ac740fa0763-kube-api-access-tgbkg\") pod \"redhat-marketplace-sfj6c\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.706588 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-59c54c548d-8fzq9" Feb 27 06:26:30 crc kubenswrapper[4725]: I0227 06:26:30.759418 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:31 crc kubenswrapper[4725]: I0227 06:26:31.121489 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:31 crc kubenswrapper[4725]: I0227 06:26:31.121565 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:31 crc kubenswrapper[4725]: I0227 06:26:31.165156 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfj6c"] Feb 27 06:26:31 crc kubenswrapper[4725]: I0227 06:26:31.168353 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:31 crc kubenswrapper[4725]: I0227 06:26:31.347619 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfj6c" event={"ID":"ea7d3666-0f1e-4788-8c39-9ac740fa0763","Type":"ContainerStarted","Data":"e2952a211f59206724ca1eb25a5146ddfae411be762d2e9b2b543438ff93443c"} Feb 27 06:26:31 crc kubenswrapper[4725]: I0227 06:26:31.417986 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bps24" Feb 27 06:26:32 crc kubenswrapper[4725]: I0227 06:26:32.356609 4725 generic.go:334] "Generic (PLEG): container finished" podID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerID="8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a" exitCode=0 Feb 27 06:26:32 crc kubenswrapper[4725]: I0227 06:26:32.357576 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfj6c" 
event={"ID":"ea7d3666-0f1e-4788-8c39-9ac740fa0763","Type":"ContainerDied","Data":"8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a"} Feb 27 06:26:32 crc kubenswrapper[4725]: I0227 06:26:32.554956 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:26:32 crc kubenswrapper[4725]: I0227 06:26:32.556069 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:26:34 crc kubenswrapper[4725]: I0227 06:26:34.013810 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 06:26:34 crc kubenswrapper[4725]: I0227 06:26:34.373200 4725 generic.go:334] "Generic (PLEG): container finished" podID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerID="4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4" exitCode=0 Feb 27 06:26:34 crc kubenswrapper[4725]: I0227 06:26:34.373324 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfj6c" event={"ID":"ea7d3666-0f1e-4788-8c39-9ac740fa0763","Type":"ContainerDied","Data":"4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4"} Feb 27 06:26:34 crc kubenswrapper[4725]: I0227 06:26:34.587593 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zptvt"] Feb 27 06:26:34 crc kubenswrapper[4725]: I0227 06:26:34.587996 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-zptvt" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="registry-server" containerID="cri-o://6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e" gracePeriod=2 Feb 27 06:26:34 crc kubenswrapper[4725]: I0227 06:26:34.955352 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.121455 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttc99\" (UniqueName: \"kubernetes.io/projected/aca14ae6-3333-4838-9732-be9096c892ac-kube-api-access-ttc99\") pod \"aca14ae6-3333-4838-9732-be9096c892ac\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.121512 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-catalog-content\") pod \"aca14ae6-3333-4838-9732-be9096c892ac\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.121592 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-utilities\") pod \"aca14ae6-3333-4838-9732-be9096c892ac\" (UID: \"aca14ae6-3333-4838-9732-be9096c892ac\") " Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.122436 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-utilities" (OuterVolumeSpecName: "utilities") pod "aca14ae6-3333-4838-9732-be9096c892ac" (UID: "aca14ae6-3333-4838-9732-be9096c892ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.142630 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca14ae6-3333-4838-9732-be9096c892ac-kube-api-access-ttc99" (OuterVolumeSpecName: "kube-api-access-ttc99") pod "aca14ae6-3333-4838-9732-be9096c892ac" (UID: "aca14ae6-3333-4838-9732-be9096c892ac"). InnerVolumeSpecName "kube-api-access-ttc99". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.182457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aca14ae6-3333-4838-9732-be9096c892ac" (UID: "aca14ae6-3333-4838-9732-be9096c892ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.222866 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.222904 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttc99\" (UniqueName: \"kubernetes.io/projected/aca14ae6-3333-4838-9732-be9096c892ac-kube-api-access-ttc99\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.222915 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aca14ae6-3333-4838-9732-be9096c892ac-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.382345 4725 generic.go:334] "Generic (PLEG): container finished" podID="aca14ae6-3333-4838-9732-be9096c892ac" 
containerID="6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e" exitCode=0 Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.382402 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zptvt" event={"ID":"aca14ae6-3333-4838-9732-be9096c892ac","Type":"ContainerDied","Data":"6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e"} Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.382436 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zptvt" event={"ID":"aca14ae6-3333-4838-9732-be9096c892ac","Type":"ContainerDied","Data":"c21617c655c31a0d947fb49ac90d6fb86a014c808015f733fcac48aa354b956e"} Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.382460 4725 scope.go:117] "RemoveContainer" containerID="6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.382637 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zptvt" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.409789 4725 scope.go:117] "RemoveContainer" containerID="c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.417432 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zptvt"] Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.421403 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zptvt"] Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.443344 4725 scope.go:117] "RemoveContainer" containerID="05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.459025 4725 scope.go:117] "RemoveContainer" containerID="6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e" Feb 27 06:26:35 crc kubenswrapper[4725]: E0227 06:26:35.459696 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e\": container with ID starting with 6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e not found: ID does not exist" containerID="6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.459833 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e"} err="failed to get container status \"6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e\": rpc error: code = NotFound desc = could not find container \"6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e\": container with ID starting with 6232c56ab15a54fcc483aa4b11e8aa6767bcd1212d21ad48723942548344c48e not 
found: ID does not exist" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.459954 4725 scope.go:117] "RemoveContainer" containerID="c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1" Feb 27 06:26:35 crc kubenswrapper[4725]: E0227 06:26:35.460527 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1\": container with ID starting with c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1 not found: ID does not exist" containerID="c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.460572 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1"} err="failed to get container status \"c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1\": rpc error: code = NotFound desc = could not find container \"c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1\": container with ID starting with c18e587e6c2b9dfd2f9b5b5cb0ffe885138de671a91ffa8486dcae5c7a4e6cb1 not found: ID does not exist" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.460600 4725 scope.go:117] "RemoveContainer" containerID="05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80" Feb 27 06:26:35 crc kubenswrapper[4725]: E0227 06:26:35.460928 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80\": container with ID starting with 05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80 not found: ID does not exist" containerID="05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80" Feb 27 06:26:35 crc kubenswrapper[4725]: I0227 06:26:35.460989 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80"} err="failed to get container status \"05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80\": rpc error: code = NotFound desc = could not find container \"05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80\": container with ID starting with 05e866033265525b8a3a1d2f58cea3e121f00be722f3e9bc87dd09e13359fc80 not found: ID does not exist" Feb 27 06:26:36 crc kubenswrapper[4725]: I0227 06:26:36.258606 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca14ae6-3333-4838-9732-be9096c892ac" path="/var/lib/kubelet/pods/aca14ae6-3333-4838-9732-be9096c892ac/volumes" Feb 27 06:26:36 crc kubenswrapper[4725]: I0227 06:26:36.391060 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfj6c" event={"ID":"ea7d3666-0f1e-4788-8c39-9ac740fa0763","Type":"ContainerStarted","Data":"53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee"} Feb 27 06:26:39 crc kubenswrapper[4725]: I0227 06:26:39.706230 4725 scope.go:117] "RemoveContainer" containerID="2aa66bce59acc5b8a9510321b95d44a04e9df3bed65dcc81dd688185f55a28f0" Feb 27 06:26:40 crc kubenswrapper[4725]: I0227 06:26:40.760526 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:40 crc kubenswrapper[4725]: I0227 06:26:40.760998 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:40 crc kubenswrapper[4725]: I0227 06:26:40.830717 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:40 crc kubenswrapper[4725]: I0227 06:26:40.861071 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-sfj6c" podStartSLOduration=7.345396413 podStartE2EDuration="10.861026026s" podCreationTimestamp="2026-02-27 06:26:30 +0000 UTC" firstStartedPulling="2026-02-27 06:26:32.359351633 +0000 UTC m=+970.821972242" lastFinishedPulling="2026-02-27 06:26:35.874981276 +0000 UTC m=+974.337601855" observedRunningTime="2026-02-27 06:26:36.404359846 +0000 UTC m=+974.866980425" watchObservedRunningTime="2026-02-27 06:26:40.861026026 +0000 UTC m=+979.323646645" Feb 27 06:26:41 crc kubenswrapper[4725]: I0227 06:26:41.487045 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:42 crc kubenswrapper[4725]: I0227 06:26:42.592763 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfj6c"] Feb 27 06:26:43 crc kubenswrapper[4725]: I0227 06:26:43.449214 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfj6c" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerName="registry-server" containerID="cri-o://53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee" gracePeriod=2 Feb 27 06:26:43 crc kubenswrapper[4725]: I0227 06:26:43.900914 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.054893 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-utilities\") pod \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.055365 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbkg\" (UniqueName: \"kubernetes.io/projected/ea7d3666-0f1e-4788-8c39-9ac740fa0763-kube-api-access-tgbkg\") pod \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.056016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-catalog-content\") pod \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\" (UID: \"ea7d3666-0f1e-4788-8c39-9ac740fa0763\") " Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.056260 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-utilities" (OuterVolumeSpecName: "utilities") pod "ea7d3666-0f1e-4788-8c39-9ac740fa0763" (UID: "ea7d3666-0f1e-4788-8c39-9ac740fa0763"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.065531 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7d3666-0f1e-4788-8c39-9ac740fa0763-kube-api-access-tgbkg" (OuterVolumeSpecName: "kube-api-access-tgbkg") pod "ea7d3666-0f1e-4788-8c39-9ac740fa0763" (UID: "ea7d3666-0f1e-4788-8c39-9ac740fa0763"). InnerVolumeSpecName "kube-api-access-tgbkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.099196 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea7d3666-0f1e-4788-8c39-9ac740fa0763" (UID: "ea7d3666-0f1e-4788-8c39-9ac740fa0763"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.157465 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbkg\" (UniqueName: \"kubernetes.io/projected/ea7d3666-0f1e-4788-8c39-9ac740fa0763-kube-api-access-tgbkg\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.157522 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.157541 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea7d3666-0f1e-4788-8c39-9ac740fa0763-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.462330 4725 generic.go:334] "Generic (PLEG): container finished" podID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerID="53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee" exitCode=0 Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.462404 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfj6c" event={"ID":"ea7d3666-0f1e-4788-8c39-9ac740fa0763","Type":"ContainerDied","Data":"53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee"} Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.462446 4725 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfj6c" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.462508 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfj6c" event={"ID":"ea7d3666-0f1e-4788-8c39-9ac740fa0763","Type":"ContainerDied","Data":"e2952a211f59206724ca1eb25a5146ddfae411be762d2e9b2b543438ff93443c"} Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.462548 4725 scope.go:117] "RemoveContainer" containerID="53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.492078 4725 scope.go:117] "RemoveContainer" containerID="4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.496679 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfj6c"] Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.505617 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfj6c"] Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.513688 4725 scope.go:117] "RemoveContainer" containerID="8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.542144 4725 scope.go:117] "RemoveContainer" containerID="53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee" Feb 27 06:26:44 crc kubenswrapper[4725]: E0227 06:26:44.542788 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee\": container with ID starting with 53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee not found: ID does not exist" containerID="53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.543009 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee"} err="failed to get container status \"53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee\": rpc error: code = NotFound desc = could not find container \"53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee\": container with ID starting with 53d870c90594d326f5350ba5adb62e9ca6372173046e2dffe98a45506786d5ee not found: ID does not exist" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.543214 4725 scope.go:117] "RemoveContainer" containerID="4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4" Feb 27 06:26:44 crc kubenswrapper[4725]: E0227 06:26:44.543912 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4\": container with ID starting with 4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4 not found: ID does not exist" containerID="4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.543967 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4"} err="failed to get container status \"4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4\": rpc error: code = NotFound desc = could not find container \"4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4\": container with ID starting with 4edc0820db9f49815b2b818aa1f346af7eb916d1a07229db700be70bd25c85e4 not found: ID does not exist" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.544008 4725 scope.go:117] "RemoveContainer" containerID="8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a" Feb 27 06:26:44 crc kubenswrapper[4725]: E0227 
06:26:44.544434 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a\": container with ID starting with 8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a not found: ID does not exist" containerID="8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a" Feb 27 06:26:44 crc kubenswrapper[4725]: I0227 06:26:44.544598 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a"} err="failed to get container status \"8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a\": rpc error: code = NotFound desc = could not find container \"8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a\": container with ID starting with 8b240e6b4e44253aca992cd00fcac1427aa98635989c7f225505e1d4f9af234a not found: ID does not exist" Feb 27 06:26:46 crc kubenswrapper[4725]: I0227 06:26:46.264957 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" path="/var/lib/kubelet/pods/ea7d3666-0f1e-4788-8c39-9ac740fa0763/volumes" Feb 27 06:26:50 crc kubenswrapper[4725]: I0227 06:26:50.326665 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-679f87455-8frdc" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118046 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lkpgf"] Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.118665 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerName="registry-server" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118682 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" 
containerName="registry-server" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.118692 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="extract-utilities" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118699 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="extract-utilities" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.118711 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="registry-server" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118719 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="registry-server" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.118728 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="extract-content" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118734 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="extract-content" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.118750 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerName="extract-utilities" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118756 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerName="extract-utilities" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.118767 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerName="extract-content" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118772 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" 
containerName="extract-content" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118885 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca14ae6-3333-4838-9732-be9096c892ac" containerName="registry-server" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.118902 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7d3666-0f1e-4788-8c39-9ac740fa0763" containerName="registry-server" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.120900 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.121428 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn"] Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.122131 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.123124 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5nlvk" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.126511 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.126719 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.130442 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.140248 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn"] Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.239592 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/speaker-r767c"] Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.240471 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.254894 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.255080 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-js6rp" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.255188 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.255320 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.260667 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-smlh9"] Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.262440 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267127 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-conf\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267170 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gllx\" (UniqueName: \"kubernetes.io/projected/21527ec2-4ffa-49d2-9866-89690a83fa42-kube-api-access-6gllx\") pod \"frr-k8s-webhook-server-7f989f654f-vsfpn\" (UID: \"21527ec2-4ffa-49d2-9866-89690a83fa42\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267190 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-reloader\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267204 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b46659e-d3d2-46a7-a93b-1209af0baea4-metrics-certs\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267222 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-startup\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " 
pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267256 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-metrics\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267279 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-sockets\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267315 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d29pj\" (UniqueName: \"kubernetes.io/projected/8b46659e-d3d2-46a7-a93b-1209af0baea4-kube-api-access-d29pj\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.267343 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21527ec2-4ffa-49d2-9866-89690a83fa42-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vsfpn\" (UID: \"21527ec2-4ffa-49d2-9866-89690a83fa42\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.268627 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.275940 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-smlh9"] Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.368712 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metrics-certs\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.368787 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-metrics\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.368839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-sockets\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.368880 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.368931 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metallb-excludel2\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.368966 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ba636a80-6000-456f-a447-d754b6d0acd2-cert\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d29pj\" (UniqueName: \"kubernetes.io/projected/8b46659e-d3d2-46a7-a93b-1209af0baea4-kube-api-access-d29pj\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369046 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba636a80-6000-456f-a447-d754b6d0acd2-metrics-certs\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369081 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckxp\" (UniqueName: \"kubernetes.io/projected/ba636a80-6000-456f-a447-d754b6d0acd2-kube-api-access-wckxp\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369122 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21527ec2-4ffa-49d2-9866-89690a83fa42-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vsfpn\" (UID: \"21527ec2-4ffa-49d2-9866-89690a83fa42\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369196 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-conf\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369232 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gllx\" (UniqueName: \"kubernetes.io/projected/21527ec2-4ffa-49d2-9866-89690a83fa42-kube-api-access-6gllx\") pod \"frr-k8s-webhook-server-7f989f654f-vsfpn\" (UID: \"21527ec2-4ffa-49d2-9866-89690a83fa42\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369264 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7rh\" (UniqueName: \"kubernetes.io/projected/0b817ed1-7c17-4e44-a421-c43b2c06ec64-kube-api-access-jn7rh\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369327 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-reloader\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369353 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b46659e-d3d2-46a7-a93b-1209af0baea4-metrics-certs\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.369382 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-startup\") pod \"frr-k8s-lkpgf\" (UID: 
\"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.370124 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-metrics\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.370515 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-sockets\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.370913 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-startup\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.372580 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-frr-conf\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.372900 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8b46659e-d3d2-46a7-a93b-1209af0baea4-reloader\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.377381 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/21527ec2-4ffa-49d2-9866-89690a83fa42-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vsfpn\" (UID: \"21527ec2-4ffa-49d2-9866-89690a83fa42\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.378584 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b46659e-d3d2-46a7-a93b-1209af0baea4-metrics-certs\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.393946 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gllx\" (UniqueName: \"kubernetes.io/projected/21527ec2-4ffa-49d2-9866-89690a83fa42-kube-api-access-6gllx\") pod \"frr-k8s-webhook-server-7f989f654f-vsfpn\" (UID: \"21527ec2-4ffa-49d2-9866-89690a83fa42\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.396887 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d29pj\" (UniqueName: \"kubernetes.io/projected/8b46659e-d3d2-46a7-a93b-1209af0baea4-kube-api-access-d29pj\") pod \"frr-k8s-lkpgf\" (UID: \"8b46659e-d3d2-46a7-a93b-1209af0baea4\") " pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.449253 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.460181 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.470337 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.470487 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metallb-excludel2\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.470605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba636a80-6000-456f-a447-d754b6d0acd2-cert\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.470729 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba636a80-6000-456f-a447-d754b6d0acd2-metrics-certs\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.470843 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wckxp\" (UniqueName: \"kubernetes.io/projected/ba636a80-6000-456f-a447-d754b6d0acd2-kube-api-access-wckxp\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc 
kubenswrapper[4725]: I0227 06:26:51.470990 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7rh\" (UniqueName: \"kubernetes.io/projected/0b817ed1-7c17-4e44-a421-c43b2c06ec64-kube-api-access-jn7rh\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.471120 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metrics-certs\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.470730 4725 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.471430 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist podName:0b817ed1-7c17-4e44-a421-c43b2c06ec64 nodeName:}" failed. No retries permitted until 2026-02-27 06:26:51.971407834 +0000 UTC m=+990.434028403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist") pod "speaker-r767c" (UID: "0b817ed1-7c17-4e44-a421-c43b2c06ec64") : secret "metallb-memberlist" not found Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.471364 4725 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.471642 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metrics-certs podName:0b817ed1-7c17-4e44-a421-c43b2c06ec64 nodeName:}" failed. 
No retries permitted until 2026-02-27 06:26:51.97162045 +0000 UTC m=+990.434241029 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metrics-certs") pod "speaker-r767c" (UID: "0b817ed1-7c17-4e44-a421-c43b2c06ec64") : secret "speaker-certs-secret" not found Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.471729 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metallb-excludel2\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.474470 4725 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.476530 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba636a80-6000-456f-a447-d754b6d0acd2-metrics-certs\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.485859 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba636a80-6000-456f-a447-d754b6d0acd2-cert\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.492676 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckxp\" (UniqueName: \"kubernetes.io/projected/ba636a80-6000-456f-a447-d754b6d0acd2-kube-api-access-wckxp\") pod \"controller-86ddb6bd46-smlh9\" (UID: \"ba636a80-6000-456f-a447-d754b6d0acd2\") " 
pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.493965 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7rh\" (UniqueName: \"kubernetes.io/projected/0b817ed1-7c17-4e44-a421-c43b2c06ec64-kube-api-access-jn7rh\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.585966 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.980039 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metrics-certs\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.981977 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.982212 4725 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 27 06:26:51 crc kubenswrapper[4725]: E0227 06:26:51.982317 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist podName:0b817ed1-7c17-4e44-a421-c43b2c06ec64 nodeName:}" failed. No retries permitted until 2026-02-27 06:26:52.982295032 +0000 UTC m=+991.444915611 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist") pod "speaker-r767c" (UID: "0b817ed1-7c17-4e44-a421-c43b2c06ec64") : secret "metallb-memberlist" not found Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.982484 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn"] Feb 27 06:26:51 crc kubenswrapper[4725]: I0227 06:26:51.990010 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-metrics-certs\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.050947 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-smlh9"] Feb 27 06:26:52 crc kubenswrapper[4725]: W0227 06:26:52.054892 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba636a80_6000_456f_a447_d754b6d0acd2.slice/crio-e56811032e07066efd43c78b8460585978e9486b6c96e4b5739a3756933cdeeb WatchSource:0}: Error finding container e56811032e07066efd43c78b8460585978e9486b6c96e4b5739a3756933cdeeb: Status 404 returned error can't find the container with id e56811032e07066efd43c78b8460585978e9486b6c96e4b5739a3756933cdeeb Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.541524 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-smlh9" event={"ID":"ba636a80-6000-456f-a447-d754b6d0acd2","Type":"ContainerStarted","Data":"805dafb16ec9321034fbd2568d74d9f5e1158ac9e72b0b9b936cfd63a73eaf6b"} Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.541571 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-smlh9" 
event={"ID":"ba636a80-6000-456f-a447-d754b6d0acd2","Type":"ContainerStarted","Data":"54fd3c16f964e769d694bbcd87a2cc0b5f10e4d0fa538b685b44b2ec3f5fcb69"} Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.541582 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-smlh9" event={"ID":"ba636a80-6000-456f-a447-d754b6d0acd2","Type":"ContainerStarted","Data":"e56811032e07066efd43c78b8460585978e9486b6c96e4b5739a3756933cdeeb"} Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.541687 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.542713 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"b657b4090e18cbac9295ca6b1b38fbf02ea98251cd8af5b9bc2d0b0db181c1bf"} Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.544280 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" event={"ID":"21527ec2-4ffa-49d2-9866-89690a83fa42","Type":"ContainerStarted","Data":"433d3fe21e8b52dae273330f7ba1c963a892b4b7c83f518010e8f936afc48243"} Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.563424 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-smlh9" podStartSLOduration=1.563396949 podStartE2EDuration="1.563396949s" podCreationTimestamp="2026-02-27 06:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:26:52.562487643 +0000 UTC m=+991.025108222" watchObservedRunningTime="2026-02-27 06:26:52.563396949 +0000 UTC m=+991.026017578" Feb 27 06:26:52 crc kubenswrapper[4725]: I0227 06:26:52.996084 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:53 crc kubenswrapper[4725]: I0227 06:26:53.003583 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b817ed1-7c17-4e44-a421-c43b2c06ec64-memberlist\") pod \"speaker-r767c\" (UID: \"0b817ed1-7c17-4e44-a421-c43b2c06ec64\") " pod="metallb-system/speaker-r767c" Feb 27 06:26:53 crc kubenswrapper[4725]: I0227 06:26:53.076222 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-r767c" Feb 27 06:26:53 crc kubenswrapper[4725]: W0227 06:26:53.112926 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b817ed1_7c17_4e44_a421_c43b2c06ec64.slice/crio-91292b635bfdeb77361c70f4d0bd1bcc23c0b4c4ccbf0a79a39fe3577e6c4984 WatchSource:0}: Error finding container 91292b635bfdeb77361c70f4d0bd1bcc23c0b4c4ccbf0a79a39fe3577e6c4984: Status 404 returned error can't find the container with id 91292b635bfdeb77361c70f4d0bd1bcc23c0b4c4ccbf0a79a39fe3577e6c4984 Feb 27 06:26:53 crc kubenswrapper[4725]: I0227 06:26:53.561069 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-r767c" event={"ID":"0b817ed1-7c17-4e44-a421-c43b2c06ec64","Type":"ContainerStarted","Data":"c99a638794d484dbd0e066783f9fe48b382b6478daca706a8f2e4fdfa16782c0"} Feb 27 06:26:53 crc kubenswrapper[4725]: I0227 06:26:53.561105 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-r767c" event={"ID":"0b817ed1-7c17-4e44-a421-c43b2c06ec64","Type":"ContainerStarted","Data":"91292b635bfdeb77361c70f4d0bd1bcc23c0b4c4ccbf0a79a39fe3577e6c4984"} Feb 27 06:26:54 crc kubenswrapper[4725]: I0227 06:26:54.569308 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/speaker-r767c" event={"ID":"0b817ed1-7c17-4e44-a421-c43b2c06ec64","Type":"ContainerStarted","Data":"a9b91c903fb3459e9cbfd9c8616ab5e7cc6ba9eec28075d6d53ebcfb1c998942"} Feb 27 06:26:54 crc kubenswrapper[4725]: I0227 06:26:54.569628 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-r767c" Feb 27 06:26:54 crc kubenswrapper[4725]: I0227 06:26:54.592584 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-r767c" podStartSLOduration=3.5925679600000002 podStartE2EDuration="3.59256796s" podCreationTimestamp="2026-02-27 06:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:26:54.591266354 +0000 UTC m=+993.053886913" watchObservedRunningTime="2026-02-27 06:26:54.59256796 +0000 UTC m=+993.055188529" Feb 27 06:26:59 crc kubenswrapper[4725]: I0227 06:26:59.625075 4725 generic.go:334] "Generic (PLEG): container finished" podID="8b46659e-d3d2-46a7-a93b-1209af0baea4" containerID="72d69809169016fe9c088ef7070cd8e838445ab0ce84f84daeab13a28807b44d" exitCode=0 Feb 27 06:26:59 crc kubenswrapper[4725]: I0227 06:26:59.625357 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerDied","Data":"72d69809169016fe9c088ef7070cd8e838445ab0ce84f84daeab13a28807b44d"} Feb 27 06:27:00 crc kubenswrapper[4725]: I0227 06:27:00.637344 4725 generic.go:334] "Generic (PLEG): container finished" podID="8b46659e-d3d2-46a7-a93b-1209af0baea4" containerID="0df979fcdc993c94873d2d58ed654634c6e57581322e351769d681817a7ebcda" exitCode=0 Feb 27 06:27:00 crc kubenswrapper[4725]: I0227 06:27:00.637411 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" 
event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerDied","Data":"0df979fcdc993c94873d2d58ed654634c6e57581322e351769d681817a7ebcda"} Feb 27 06:27:00 crc kubenswrapper[4725]: I0227 06:27:00.639933 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" event={"ID":"21527ec2-4ffa-49d2-9866-89690a83fa42","Type":"ContainerStarted","Data":"19fb8c63efd449a374c4c696b6891db9ad97a5e6ac3f13a66f7aeb7230d482e9"} Feb 27 06:27:00 crc kubenswrapper[4725]: I0227 06:27:00.640169 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:27:00 crc kubenswrapper[4725]: I0227 06:27:00.702266 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" podStartSLOduration=2.221611127 podStartE2EDuration="9.702232276s" podCreationTimestamp="2026-02-27 06:26:51 +0000 UTC" firstStartedPulling="2026-02-27 06:26:51.993175559 +0000 UTC m=+990.455796138" lastFinishedPulling="2026-02-27 06:26:59.473796678 +0000 UTC m=+997.936417287" observedRunningTime="2026-02-27 06:27:00.700085206 +0000 UTC m=+999.162705855" watchObservedRunningTime="2026-02-27 06:27:00.702232276 +0000 UTC m=+999.164852875" Feb 27 06:27:01 crc kubenswrapper[4725]: I0227 06:27:01.650757 4725 generic.go:334] "Generic (PLEG): container finished" podID="8b46659e-d3d2-46a7-a93b-1209af0baea4" containerID="b56c495e46f9afb3dca674fb756420538b125760748640ec8996458470ae5917" exitCode=0 Feb 27 06:27:01 crc kubenswrapper[4725]: I0227 06:27:01.650868 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerDied","Data":"b56c495e46f9afb3dca674fb756420538b125760748640ec8996458470ae5917"} Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.554538 4725 patch_prober.go:28] interesting 
pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.554846 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.554893 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.555472 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03dc8bea10798b61bde03e0e8912868ddc55a9db35d9f15615b091af21e96406"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.555527 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://03dc8bea10798b61bde03e0e8912868ddc55a9db35d9f15615b091af21e96406" gracePeriod=600 Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.667933 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"edde71760cef098ab8972221458ca5cfd28ded8544b9f10471296c2e3831ca3f"} Feb 27 
06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.668920 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"5867d57a4377e20eaf551c16a4dd15d2b4e20344a8753d854e3b90c759abd142"} Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.669021 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"d0021389ba578494b942b0b4bc295a76b6ef8182b723b8d9d0508b2dd3de4927"} Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.669094 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"bc12aeccb79a6004346c7df063f8652148d6d0732f3c4046223815cd7600415f"} Feb 27 06:27:02 crc kubenswrapper[4725]: I0227 06:27:02.669156 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"601a242065ff8539e050f13faf3f94157d13dfbd950bd39cdeff5277b7682c5c"} Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.082870 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-r767c" Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.682443 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="03dc8bea10798b61bde03e0e8912868ddc55a9db35d9f15615b091af21e96406" exitCode=0 Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.682597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"03dc8bea10798b61bde03e0e8912868ddc55a9db35d9f15615b091af21e96406"} Feb 27 06:27:03 crc 
kubenswrapper[4725]: I0227 06:27:03.682860 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"861186ed7f2e4b7df76123b88a35b60ba94275897d0a13a296d2198ea2a7a166"} Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.682925 4725 scope.go:117] "RemoveContainer" containerID="f1f293a213b8036dbb26c658eb7bfe019d4be19b3467bb55e8ef79040281b13b" Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.692736 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lkpgf" event={"ID":"8b46659e-d3d2-46a7-a93b-1209af0baea4","Type":"ContainerStarted","Data":"56ce399b71cf4a33915ed2092ecdc4650fda0149d4738db35c4972aafd93cf48"} Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.693131 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:27:03 crc kubenswrapper[4725]: I0227 06:27:03.748547 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lkpgf" podStartSLOduration=5.777250641 podStartE2EDuration="12.748521955s" podCreationTimestamp="2026-02-27 06:26:51 +0000 UTC" firstStartedPulling="2026-02-27 06:26:51.667660811 +0000 UTC m=+990.130281380" lastFinishedPulling="2026-02-27 06:26:58.638932105 +0000 UTC m=+997.101552694" observedRunningTime="2026-02-27 06:27:03.735232211 +0000 UTC m=+1002.197852810" watchObservedRunningTime="2026-02-27 06:27:03.748521955 +0000 UTC m=+1002.211142544" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.168721 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-q2t6m"] Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.170247 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.173529 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-77b67" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.174601 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.176749 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.192062 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-q2t6m"] Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.316092 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwwng\" (UniqueName: \"kubernetes.io/projected/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24-kube-api-access-wwwng\") pod \"openstack-operator-index-q2t6m\" (UID: \"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24\") " pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.417889 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwwng\" (UniqueName: \"kubernetes.io/projected/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24-kube-api-access-wwwng\") pod \"openstack-operator-index-q2t6m\" (UID: \"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24\") " pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.450212 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.456639 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwwng\" 
(UniqueName: \"kubernetes.io/projected/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24-kube-api-access-wwwng\") pod \"openstack-operator-index-q2t6m\" (UID: \"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24\") " pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.499521 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.504791 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:27:06 crc kubenswrapper[4725]: I0227 06:27:06.789177 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-q2t6m"] Feb 27 06:27:07 crc kubenswrapper[4725]: I0227 06:27:07.730274 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q2t6m" event={"ID":"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24","Type":"ContainerStarted","Data":"e08b8176d122a6cf3286e75ab055e7d0d8c465e01c0918cd0c5fae8371e1e0d3"} Feb 27 06:27:09 crc kubenswrapper[4725]: I0227 06:27:09.529214 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-q2t6m"] Feb 27 06:27:09 crc kubenswrapper[4725]: I0227 06:27:09.745963 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q2t6m" event={"ID":"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24","Type":"ContainerStarted","Data":"2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29"} Feb 27 06:27:09 crc kubenswrapper[4725]: I0227 06:27:09.777848 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-q2t6m" podStartSLOduration=1.395538561 podStartE2EDuration="3.777814138s" podCreationTimestamp="2026-02-27 06:27:06 +0000 UTC" firstStartedPulling="2026-02-27 06:27:06.796672085 +0000 UTC 
m=+1005.259292694" lastFinishedPulling="2026-02-27 06:27:09.178947712 +0000 UTC m=+1007.641568271" observedRunningTime="2026-02-27 06:27:09.766999073 +0000 UTC m=+1008.229619702" watchObservedRunningTime="2026-02-27 06:27:09.777814138 +0000 UTC m=+1008.240434747" Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.145995 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r75bv"] Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.147487 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.161436 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r75bv"] Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.292451 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffl98\" (UniqueName: \"kubernetes.io/projected/d6b63be0-6e6a-4e30-8648-28a0174338a4-kube-api-access-ffl98\") pod \"openstack-operator-index-r75bv\" (UID: \"d6b63be0-6e6a-4e30-8648-28a0174338a4\") " pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.394460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffl98\" (UniqueName: \"kubernetes.io/projected/d6b63be0-6e6a-4e30-8648-28a0174338a4-kube-api-access-ffl98\") pod \"openstack-operator-index-r75bv\" (UID: \"d6b63be0-6e6a-4e30-8648-28a0174338a4\") " pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.417907 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffl98\" (UniqueName: \"kubernetes.io/projected/d6b63be0-6e6a-4e30-8648-28a0174338a4-kube-api-access-ffl98\") pod \"openstack-operator-index-r75bv\" (UID: 
\"d6b63be0-6e6a-4e30-8648-28a0174338a4\") " pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.514742 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.756711 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-q2t6m" podUID="b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" containerName="registry-server" containerID="cri-o://2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29" gracePeriod=2 Feb 27 06:27:10 crc kubenswrapper[4725]: I0227 06:27:10.831088 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r75bv"] Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.084496 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.204185 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwwng\" (UniqueName: \"kubernetes.io/projected/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24-kube-api-access-wwwng\") pod \"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24\" (UID: \"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24\") " Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.210672 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24-kube-api-access-wwwng" (OuterVolumeSpecName: "kube-api-access-wwwng") pod "b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" (UID: "b53efac8-2a68-4fac-a4e3-9d8c4afb5c24"). InnerVolumeSpecName "kube-api-access-wwwng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.306774 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwwng\" (UniqueName: \"kubernetes.io/projected/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24-kube-api-access-wwwng\") on node \"crc\" DevicePath \"\"" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.456683 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lkpgf" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.472958 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vsfpn" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.589456 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-smlh9" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.769687 4725 generic.go:334] "Generic (PLEG): container finished" podID="b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" containerID="2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29" exitCode=0 Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.769736 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-q2t6m" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.769793 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q2t6m" event={"ID":"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24","Type":"ContainerDied","Data":"2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29"} Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.769834 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q2t6m" event={"ID":"b53efac8-2a68-4fac-a4e3-9d8c4afb5c24","Type":"ContainerDied","Data":"e08b8176d122a6cf3286e75ab055e7d0d8c465e01c0918cd0c5fae8371e1e0d3"} Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.769862 4725 scope.go:117] "RemoveContainer" containerID="2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.778666 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r75bv" event={"ID":"d6b63be0-6e6a-4e30-8648-28a0174338a4","Type":"ContainerStarted","Data":"0ca703b35deacdb8ea6b326b9cec7041979a6ecb3fd26b92293201e7503526cf"} Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.778742 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r75bv" event={"ID":"d6b63be0-6e6a-4e30-8648-28a0174338a4","Type":"ContainerStarted","Data":"35fce9e2cf8ac1043a6ad72e01a0ce244cb726945ecd459b82b5aa9037d5b897"} Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.799910 4725 scope.go:117] "RemoveContainer" containerID="2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29" Feb 27 06:27:11 crc kubenswrapper[4725]: E0227 06:27:11.800611 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29\": container with ID starting with 2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29 not found: ID does not exist" containerID="2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.800654 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29"} err="failed to get container status \"2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29\": rpc error: code = NotFound desc = could not find container \"2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29\": container with ID starting with 2b93049b852333a5be55e827e965ccb82a3546cd7b2d9d665762775b66615d29 not found: ID does not exist" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.812212 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-r75bv" podStartSLOduration=1.765483339 podStartE2EDuration="1.812184305s" podCreationTimestamp="2026-02-27 06:27:10 +0000 UTC" firstStartedPulling="2026-02-27 06:27:10.871434079 +0000 UTC m=+1009.334054648" lastFinishedPulling="2026-02-27 06:27:10.918135045 +0000 UTC m=+1009.380755614" observedRunningTime="2026-02-27 06:27:11.80560004 +0000 UTC m=+1010.268220649" watchObservedRunningTime="2026-02-27 06:27:11.812184305 +0000 UTC m=+1010.274804934" Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.826331 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-q2t6m"] Feb 27 06:27:11 crc kubenswrapper[4725]: I0227 06:27:11.831773 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-q2t6m"] Feb 27 06:27:12 crc kubenswrapper[4725]: I0227 06:27:12.267212 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" path="/var/lib/kubelet/pods/b53efac8-2a68-4fac-a4e3-9d8c4afb5c24/volumes" Feb 27 06:27:20 crc kubenswrapper[4725]: I0227 06:27:20.516685 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:20 crc kubenswrapper[4725]: I0227 06:27:20.517493 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:20 crc kubenswrapper[4725]: I0227 06:27:20.569532 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:20 crc kubenswrapper[4725]: I0227 06:27:20.916859 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-r75bv" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.381857 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr"] Feb 27 06:27:21 crc kubenswrapper[4725]: E0227 06:27:21.382125 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" containerName="registry-server" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.382139 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" containerName="registry-server" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.382275 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53efac8-2a68-4fac-a4e3-9d8c4afb5c24" containerName="registry-server" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.383363 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.386470 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dctqb" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.395801 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr"] Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.562716 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxrd\" (UniqueName: \"kubernetes.io/projected/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-kube-api-access-8hxrd\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.562949 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-bundle\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.563053 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-util\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 
06:27:21.663974 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-bundle\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.664068 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-util\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.664172 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxrd\" (UniqueName: \"kubernetes.io/projected/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-kube-api-access-8hxrd\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.664971 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-bundle\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.665001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-util\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.690270 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxrd\" (UniqueName: \"kubernetes.io/projected/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-kube-api-access-8hxrd\") pod \"56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:21 crc kubenswrapper[4725]: I0227 06:27:21.703708 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:22 crc kubenswrapper[4725]: I0227 06:27:22.185072 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr"] Feb 27 06:27:22 crc kubenswrapper[4725]: W0227 06:27:22.195439 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c9b70fa_9547_4fa3_b567_68ee52aa3b21.slice/crio-5d6807b310da77f341bcf1b7ce108855101cc5ffb782b31eef514e4348a4165a WatchSource:0}: Error finding container 5d6807b310da77f341bcf1b7ce108855101cc5ffb782b31eef514e4348a4165a: Status 404 returned error can't find the container with id 5d6807b310da77f341bcf1b7ce108855101cc5ffb782b31eef514e4348a4165a Feb 27 06:27:22 crc kubenswrapper[4725]: I0227 06:27:22.898455 4725 generic.go:334] "Generic (PLEG): container finished" podID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerID="a0b6cbe9f91447318ae8f61d2aa1a37f2a3e45c3d63b835f1aef64cd775b1c46" exitCode=0 Feb 27 
06:27:22 crc kubenswrapper[4725]: I0227 06:27:22.898559 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" event={"ID":"9c9b70fa-9547-4fa3-b567-68ee52aa3b21","Type":"ContainerDied","Data":"a0b6cbe9f91447318ae8f61d2aa1a37f2a3e45c3d63b835f1aef64cd775b1c46"} Feb 27 06:27:22 crc kubenswrapper[4725]: I0227 06:27:22.898846 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" event={"ID":"9c9b70fa-9547-4fa3-b567-68ee52aa3b21","Type":"ContainerStarted","Data":"5d6807b310da77f341bcf1b7ce108855101cc5ffb782b31eef514e4348a4165a"} Feb 27 06:27:23 crc kubenswrapper[4725]: I0227 06:27:23.908380 4725 generic.go:334] "Generic (PLEG): container finished" podID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerID="e16ea8dcb2b50a7659dd236e2d411867c4fa23bea932c9cd4a9060681e89efd5" exitCode=0 Feb 27 06:27:23 crc kubenswrapper[4725]: I0227 06:27:23.908447 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" event={"ID":"9c9b70fa-9547-4fa3-b567-68ee52aa3b21","Type":"ContainerDied","Data":"e16ea8dcb2b50a7659dd236e2d411867c4fa23bea932c9cd4a9060681e89efd5"} Feb 27 06:27:24 crc kubenswrapper[4725]: I0227 06:27:24.921067 4725 generic.go:334] "Generic (PLEG): container finished" podID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerID="6a279f40fdb08665390cd7b6710f10fe4da8ee1fc51df14bcd64101dfc870afe" exitCode=0 Feb 27 06:27:24 crc kubenswrapper[4725]: I0227 06:27:24.921186 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" event={"ID":"9c9b70fa-9547-4fa3-b567-68ee52aa3b21","Type":"ContainerDied","Data":"6a279f40fdb08665390cd7b6710f10fe4da8ee1fc51df14bcd64101dfc870afe"} Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.271390 
4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.447034 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-bundle\") pod \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.447134 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hxrd\" (UniqueName: \"kubernetes.io/projected/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-kube-api-access-8hxrd\") pod \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.447210 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-util\") pod \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\" (UID: \"9c9b70fa-9547-4fa3-b567-68ee52aa3b21\") " Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.448551 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-bundle" (OuterVolumeSpecName: "bundle") pod "9c9b70fa-9547-4fa3-b567-68ee52aa3b21" (UID: "9c9b70fa-9547-4fa3-b567-68ee52aa3b21"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.456799 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-kube-api-access-8hxrd" (OuterVolumeSpecName: "kube-api-access-8hxrd") pod "9c9b70fa-9547-4fa3-b567-68ee52aa3b21" (UID: "9c9b70fa-9547-4fa3-b567-68ee52aa3b21"). 
InnerVolumeSpecName "kube-api-access-8hxrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.482448 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-util" (OuterVolumeSpecName: "util") pod "9c9b70fa-9547-4fa3-b567-68ee52aa3b21" (UID: "9c9b70fa-9547-4fa3-b567-68ee52aa3b21"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.548542 4725 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.548577 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hxrd\" (UniqueName: \"kubernetes.io/projected/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-kube-api-access-8hxrd\") on node \"crc\" DevicePath \"\"" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.548591 4725 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c9b70fa-9547-4fa3-b567-68ee52aa3b21-util\") on node \"crc\" DevicePath \"\"" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.947824 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" event={"ID":"9c9b70fa-9547-4fa3-b567-68ee52aa3b21","Type":"ContainerDied","Data":"5d6807b310da77f341bcf1b7ce108855101cc5ffb782b31eef514e4348a4165a"} Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.948372 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d6807b310da77f341bcf1b7ce108855101cc5ffb782b31eef514e4348a4165a" Feb 27 06:27:26 crc kubenswrapper[4725]: I0227 06:27:26.948190 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.402825 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7544f859d8-744ft"] Feb 27 06:27:34 crc kubenswrapper[4725]: E0227 06:27:34.403739 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="util" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.403759 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="util" Feb 27 06:27:34 crc kubenswrapper[4725]: E0227 06:27:34.403786 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="extract" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.403795 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="extract" Feb 27 06:27:34 crc kubenswrapper[4725]: E0227 06:27:34.403809 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="pull" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.403818 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="pull" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.403964 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9b70fa-9547-4fa3-b567-68ee52aa3b21" containerName="extract" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.404571 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.407181 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-jpmjp" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.431132 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7544f859d8-744ft"] Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.577495 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r54dv\" (UniqueName: \"kubernetes.io/projected/cc86b762-a7df-42aa-970c-76ebac88b004-kube-api-access-r54dv\") pod \"openstack-operator-controller-init-7544f859d8-744ft\" (UID: \"cc86b762-a7df-42aa-970c-76ebac88b004\") " pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.679213 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r54dv\" (UniqueName: \"kubernetes.io/projected/cc86b762-a7df-42aa-970c-76ebac88b004-kube-api-access-r54dv\") pod \"openstack-operator-controller-init-7544f859d8-744ft\" (UID: \"cc86b762-a7df-42aa-970c-76ebac88b004\") " pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.702173 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r54dv\" (UniqueName: \"kubernetes.io/projected/cc86b762-a7df-42aa-970c-76ebac88b004-kube-api-access-r54dv\") pod \"openstack-operator-controller-init-7544f859d8-744ft\" (UID: \"cc86b762-a7df-42aa-970c-76ebac88b004\") " pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:34 crc kubenswrapper[4725]: I0227 06:27:34.727480 4725 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:35 crc kubenswrapper[4725]: I0227 06:27:35.101595 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7544f859d8-744ft"] Feb 27 06:27:36 crc kubenswrapper[4725]: I0227 06:27:36.022458 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" event={"ID":"cc86b762-a7df-42aa-970c-76ebac88b004","Type":"ContainerStarted","Data":"c7cc7bd5db66c35f3182e81a566ee30b6b8d919b2bcc9d98eb134bec12eebdf8"} Feb 27 06:27:40 crc kubenswrapper[4725]: I0227 06:27:40.059851 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" event={"ID":"cc86b762-a7df-42aa-970c-76ebac88b004","Type":"ContainerStarted","Data":"bdc79ad0613e9ed16b9ee7a18acb4c80ddff764013117392349347174f9bc91f"} Feb 27 06:27:40 crc kubenswrapper[4725]: I0227 06:27:40.060703 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:40 crc kubenswrapper[4725]: I0227 06:27:40.120362 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" podStartSLOduration=1.9683238539999999 podStartE2EDuration="6.120336194s" podCreationTimestamp="2026-02-27 06:27:34 +0000 UTC" firstStartedPulling="2026-02-27 06:27:35.120953208 +0000 UTC m=+1033.583573787" lastFinishedPulling="2026-02-27 06:27:39.272965558 +0000 UTC m=+1037.735586127" observedRunningTime="2026-02-27 06:27:40.114683345 +0000 UTC m=+1038.577303944" watchObservedRunningTime="2026-02-27 06:27:40.120336194 +0000 UTC m=+1038.582956803" Feb 27 06:27:44 crc kubenswrapper[4725]: I0227 06:27:44.733263 4725 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7544f859d8-744ft" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.272410 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r2mdt"] Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.274245 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.301115 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2mdt"] Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.429475 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-catalog-content\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.429539 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xblb7\" (UniqueName: \"kubernetes.io/projected/2741416a-3f8b-4644-8d09-127648003383-kube-api-access-xblb7\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.429846 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-utilities\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.531550 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xblb7\" (UniqueName: \"kubernetes.io/projected/2741416a-3f8b-4644-8d09-127648003383-kube-api-access-xblb7\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.531664 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-utilities\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.531709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-catalog-content\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.532169 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-catalog-content\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.532412 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-utilities\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.561593 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xblb7\" (UniqueName: \"kubernetes.io/projected/2741416a-3f8b-4644-8d09-127648003383-kube-api-access-xblb7\") pod \"certified-operators-r2mdt\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:49 crc kubenswrapper[4725]: I0227 06:27:49.642075 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:50 crc kubenswrapper[4725]: I0227 06:27:50.115323 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r2mdt"] Feb 27 06:27:50 crc kubenswrapper[4725]: I0227 06:27:50.158566 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerStarted","Data":"cc53d8b9a1732ce9895f0fed6ea5153479ade2222da3edc1fa01e143600b4d9f"} Feb 27 06:27:51 crc kubenswrapper[4725]: I0227 06:27:51.194741 4725 generic.go:334] "Generic (PLEG): container finished" podID="2741416a-3f8b-4644-8d09-127648003383" containerID="d0a6724dc4ca9704885baf25007b166517bd939c8f271cd680da457faa0cd665" exitCode=0 Feb 27 06:27:51 crc kubenswrapper[4725]: I0227 06:27:51.194815 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerDied","Data":"d0a6724dc4ca9704885baf25007b166517bd939c8f271cd680da457faa0cd665"} Feb 27 06:27:52 crc kubenswrapper[4725]: I0227 06:27:52.203629 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerStarted","Data":"5a3143e9fc3ab467ee49e8a314ffab3091c5c56bccf5da22cae6fe93b6395e09"} Feb 27 06:27:53 crc kubenswrapper[4725]: I0227 06:27:53.213426 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="2741416a-3f8b-4644-8d09-127648003383" containerID="5a3143e9fc3ab467ee49e8a314ffab3091c5c56bccf5da22cae6fe93b6395e09" exitCode=0 Feb 27 06:27:53 crc kubenswrapper[4725]: I0227 06:27:53.213526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerDied","Data":"5a3143e9fc3ab467ee49e8a314ffab3091c5c56bccf5da22cae6fe93b6395e09"} Feb 27 06:27:54 crc kubenswrapper[4725]: I0227 06:27:54.224264 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerStarted","Data":"95a5e38e8336dea5317ecdf599a801da567be83920295672612c1fcc5355a6e6"} Feb 27 06:27:54 crc kubenswrapper[4725]: I0227 06:27:54.268491 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r2mdt" podStartSLOduration=2.858363011 podStartE2EDuration="5.26844688s" podCreationTimestamp="2026-02-27 06:27:49 +0000 UTC" firstStartedPulling="2026-02-27 06:27:51.198471246 +0000 UTC m=+1049.661091815" lastFinishedPulling="2026-02-27 06:27:53.608555105 +0000 UTC m=+1052.071175684" observedRunningTime="2026-02-27 06:27:54.267795372 +0000 UTC m=+1052.730415951" watchObservedRunningTime="2026-02-27 06:27:54.26844688 +0000 UTC m=+1052.731067469" Feb 27 06:27:59 crc kubenswrapper[4725]: I0227 06:27:59.642676 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:59 crc kubenswrapper[4725]: I0227 06:27:59.643482 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:27:59 crc kubenswrapper[4725]: I0227 06:27:59.735067 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:28:00 
crc kubenswrapper[4725]: I0227 06:28:00.125105 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536228-jrqtv"] Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.126117 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.129919 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.129922 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.130189 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.139631 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536228-jrqtv"] Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.285869 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhgv8\" (UniqueName: \"kubernetes.io/projected/c0efcde2-60c3-4d14-bec9-056e06640cc6-kube-api-access-bhgv8\") pod \"auto-csr-approver-29536228-jrqtv\" (UID: \"c0efcde2-60c3-4d14-bec9-056e06640cc6\") " pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.309667 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.383740 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r2mdt"] Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.387920 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bhgv8\" (UniqueName: \"kubernetes.io/projected/c0efcde2-60c3-4d14-bec9-056e06640cc6-kube-api-access-bhgv8\") pod \"auto-csr-approver-29536228-jrqtv\" (UID: \"c0efcde2-60c3-4d14-bec9-056e06640cc6\") " pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.404914 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhgv8\" (UniqueName: \"kubernetes.io/projected/c0efcde2-60c3-4d14-bec9-056e06640cc6-kube-api-access-bhgv8\") pod \"auto-csr-approver-29536228-jrqtv\" (UID: \"c0efcde2-60c3-4d14-bec9-056e06640cc6\") " pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.446425 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:00 crc kubenswrapper[4725]: I0227 06:28:00.758572 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536228-jrqtv"] Feb 27 06:28:01 crc kubenswrapper[4725]: I0227 06:28:01.269135 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" event={"ID":"c0efcde2-60c3-4d14-bec9-056e06640cc6","Type":"ContainerStarted","Data":"a181bccb818a970d80a43c0778a128cd21b5b339de36d5d00416da51441d7721"} Feb 27 06:28:02 crc kubenswrapper[4725]: I0227 06:28:02.275352 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r2mdt" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="registry-server" containerID="cri-o://95a5e38e8336dea5317ecdf599a801da567be83920295672612c1fcc5355a6e6" gracePeriod=2 Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.288007 4725 generic.go:334] "Generic (PLEG): container finished" podID="2741416a-3f8b-4644-8d09-127648003383" containerID="95a5e38e8336dea5317ecdf599a801da567be83920295672612c1fcc5355a6e6" 
exitCode=0 Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.288109 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerDied","Data":"95a5e38e8336dea5317ecdf599a801da567be83920295672612c1fcc5355a6e6"} Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.814106 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.944270 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xblb7\" (UniqueName: \"kubernetes.io/projected/2741416a-3f8b-4644-8d09-127648003383-kube-api-access-xblb7\") pod \"2741416a-3f8b-4644-8d09-127648003383\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.944344 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-catalog-content\") pod \"2741416a-3f8b-4644-8d09-127648003383\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.944428 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-utilities\") pod \"2741416a-3f8b-4644-8d09-127648003383\" (UID: \"2741416a-3f8b-4644-8d09-127648003383\") " Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.949393 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-utilities" (OuterVolumeSpecName: "utilities") pod "2741416a-3f8b-4644-8d09-127648003383" (UID: "2741416a-3f8b-4644-8d09-127648003383"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:28:03 crc kubenswrapper[4725]: I0227 06:28:03.964537 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2741416a-3f8b-4644-8d09-127648003383-kube-api-access-xblb7" (OuterVolumeSpecName: "kube-api-access-xblb7") pod "2741416a-3f8b-4644-8d09-127648003383" (UID: "2741416a-3f8b-4644-8d09-127648003383"). InnerVolumeSpecName "kube-api-access-xblb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.008328 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2741416a-3f8b-4644-8d09-127648003383" (UID: "2741416a-3f8b-4644-8d09-127648003383"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.046100 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xblb7\" (UniqueName: \"kubernetes.io/projected/2741416a-3f8b-4644-8d09-127648003383-kube-api-access-xblb7\") on node \"crc\" DevicePath \"\"" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.046139 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.046155 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2741416a-3f8b-4644-8d09-127648003383-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.297891 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r2mdt" 
event={"ID":"2741416a-3f8b-4644-8d09-127648003383","Type":"ContainerDied","Data":"cc53d8b9a1732ce9895f0fed6ea5153479ade2222da3edc1fa01e143600b4d9f"} Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.297950 4725 scope.go:117] "RemoveContainer" containerID="95a5e38e8336dea5317ecdf599a801da567be83920295672612c1fcc5355a6e6" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.297934 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r2mdt" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.299888 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" event={"ID":"c0efcde2-60c3-4d14-bec9-056e06640cc6","Type":"ContainerStarted","Data":"f389f3767c97b06e8758ffb74af43d9d796a2c4122053af652e1aed939e58b65"} Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.318105 4725 scope.go:117] "RemoveContainer" containerID="5a3143e9fc3ab467ee49e8a314ffab3091c5c56bccf5da22cae6fe93b6395e09" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.336869 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" podStartSLOduration=1.246791033 podStartE2EDuration="4.336850903s" podCreationTimestamp="2026-02-27 06:28:00 +0000 UTC" firstStartedPulling="2026-02-27 06:28:00.770427306 +0000 UTC m=+1059.233047875" lastFinishedPulling="2026-02-27 06:28:03.860487186 +0000 UTC m=+1062.323107745" observedRunningTime="2026-02-27 06:28:04.335773732 +0000 UTC m=+1062.798394301" watchObservedRunningTime="2026-02-27 06:28:04.336850903 +0000 UTC m=+1062.799471492" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.339460 4725 scope.go:117] "RemoveContainer" containerID="d0a6724dc4ca9704885baf25007b166517bd939c8f271cd680da457faa0cd665" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.353368 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-r2mdt"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.361927 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r2mdt"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.644535 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f"] Feb 27 06:28:04 crc kubenswrapper[4725]: E0227 06:28:04.644875 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="extract-content" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.644897 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="extract-content" Feb 27 06:28:04 crc kubenswrapper[4725]: E0227 06:28:04.644919 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="registry-server" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.644928 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="registry-server" Feb 27 06:28:04 crc kubenswrapper[4725]: E0227 06:28:04.644945 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="extract-utilities" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.644953 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="extract-utilities" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.645117 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2741416a-3f8b-4644-8d09-127648003383" containerName="registry-server" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.645705 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.647749 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qhg7c" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.650546 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.651668 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.657074 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.658542 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2wwzf" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.660468 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.665476 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.666206 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.668841 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jtl69" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.674034 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.675029 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.683706 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-js2fz" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.685823 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.704809 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.734489 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.735521 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.741854 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jmcsv" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.749478 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.754119 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsl87\" (UniqueName: \"kubernetes.io/projected/a90d813f-86f2-49c9-b7d2-66d44db8236c-kube-api-access-zsl87\") pod \"cinder-operator-controller-manager-55d77d7b5c-pl57b\" (UID: \"a90d813f-86f2-49c9-b7d2-66d44db8236c\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.754187 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wnpn\" (UniqueName: \"kubernetes.io/projected/6e80b5f0-45bb-4081-808e-800527949f7e-kube-api-access-9wnpn\") pod \"barbican-operator-controller-manager-868647ff47-6sd5f\" (UID: \"6e80b5f0-45bb-4081-808e-800527949f7e\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.765260 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.766142 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.772710 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tzmv2" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.786278 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.789490 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-llwd8"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.790352 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.792530 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.792778 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vwl4f" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.845535 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.859128 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsl87\" (UniqueName: \"kubernetes.io/projected/a90d813f-86f2-49c9-b7d2-66d44db8236c-kube-api-access-zsl87\") pod \"cinder-operator-controller-manager-55d77d7b5c-pl57b\" (UID: \"a90d813f-86f2-49c9-b7d2-66d44db8236c\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:04 crc 
kubenswrapper[4725]: I0227 06:28:04.859551 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wnpn\" (UniqueName: \"kubernetes.io/projected/6e80b5f0-45bb-4081-808e-800527949f7e-kube-api-access-9wnpn\") pod \"barbican-operator-controller-manager-868647ff47-6sd5f\" (UID: \"6e80b5f0-45bb-4081-808e-800527949f7e\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.859746 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.859884 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffs8\" (UniqueName: \"kubernetes.io/projected/55b7330d-fa67-491c-9354-3ae2f377b245-kube-api-access-fffs8\") pod \"glance-operator-controller-manager-784b5bb6c5-48l7p\" (UID: \"55b7330d-fa67-491c-9354-3ae2f377b245\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.859952 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk25b\" (UniqueName: \"kubernetes.io/projected/f1fefb43-64d1-496a-be4b-042d68027526-kube-api-access-sk25b\") pod \"designate-operator-controller-manager-6d8bf5c495-xkffm\" (UID: \"f1fefb43-64d1-496a-be4b-042d68027526\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.859973 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2w99\" (UniqueName: \"kubernetes.io/projected/246fa0fd-dd91-4c17-9754-8ed71768660a-kube-api-access-g2w99\") pod \"heat-operator-controller-manager-69f49c598c-pklxn\" (UID: 
\"246fa0fd-dd91-4c17-9754-8ed71768660a\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.866677 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x2vtj" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.868347 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.895091 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.902316 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wnpn\" (UniqueName: \"kubernetes.io/projected/6e80b5f0-45bb-4081-808e-800527949f7e-kube-api-access-9wnpn\") pod \"barbican-operator-controller-manager-868647ff47-6sd5f\" (UID: \"6e80b5f0-45bb-4081-808e-800527949f7e\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.913045 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsl87\" (UniqueName: \"kubernetes.io/projected/a90d813f-86f2-49c9-b7d2-66d44db8236c-kube-api-access-zsl87\") pod \"cinder-operator-controller-manager-55d77d7b5c-pl57b\" (UID: \"a90d813f-86f2-49c9-b7d2-66d44db8236c\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.913741 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xf7vn" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.913916 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.941621 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-llwd8"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.942614 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.950046 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.950998 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.955762 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-chkk5" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.963958 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8d6\" (UniqueName: \"kubernetes.io/projected/119c1266-bd43-49d6-a39f-93abbf47c2be-kube-api-access-vw8d6\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.964001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlt4q\" (UniqueName: \"kubernetes.io/projected/672a2ef1-a6d0-41f6-9bbf-5d157863ee48-kube-api-access-wlt4q\") pod \"horizon-operator-controller-manager-5b9b8895d5-jjgd6\" (UID: \"672a2ef1-a6d0-41f6-9bbf-5d157863ee48\") " 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.964046 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.964074 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffs8\" (UniqueName: \"kubernetes.io/projected/55b7330d-fa67-491c-9354-3ae2f377b245-kube-api-access-fffs8\") pod \"glance-operator-controller-manager-784b5bb6c5-48l7p\" (UID: \"55b7330d-fa67-491c-9354-3ae2f377b245\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.964094 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7wcv\" (UniqueName: \"kubernetes.io/projected/9eeeac0e-6f80-4882-8d61-effa2342d69b-kube-api-access-x7wcv\") pod \"ironic-operator-controller-manager-554564d7fc-wc9zj\" (UID: \"9eeeac0e-6f80-4882-8d61-effa2342d69b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.964121 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk25b\" (UniqueName: \"kubernetes.io/projected/f1fefb43-64d1-496a-be4b-042d68027526-kube-api-access-sk25b\") pod \"designate-operator-controller-manager-6d8bf5c495-xkffm\" (UID: \"f1fefb43-64d1-496a-be4b-042d68027526\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.964138 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2w99\" (UniqueName: \"kubernetes.io/projected/246fa0fd-dd91-4c17-9754-8ed71768660a-kube-api-access-g2w99\") pod \"heat-operator-controller-manager-69f49c598c-pklxn\" (UID: \"246fa0fd-dd91-4c17-9754-8ed71768660a\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.967829 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.972357 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.982014 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp"] Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.982929 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.986046 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.986550 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xtg6g" Feb 27 06:28:04 crc kubenswrapper[4725]: I0227 06:28:04.993344 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.003421 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.004266 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.013145 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2w99\" (UniqueName: \"kubernetes.io/projected/246fa0fd-dd91-4c17-9754-8ed71768660a-kube-api-access-g2w99\") pod \"heat-operator-controller-manager-69f49c598c-pklxn\" (UID: \"246fa0fd-dd91-4c17-9754-8ed71768660a\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.013417 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bzt47" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.013577 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffs8\" (UniqueName: \"kubernetes.io/projected/55b7330d-fa67-491c-9354-3ae2f377b245-kube-api-access-fffs8\") pod \"glance-operator-controller-manager-784b5bb6c5-48l7p\" (UID: \"55b7330d-fa67-491c-9354-3ae2f377b245\") " 
pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.026196 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk25b\" (UniqueName: \"kubernetes.io/projected/f1fefb43-64d1-496a-be4b-042d68027526-kube-api-access-sk25b\") pod \"designate-operator-controller-manager-6d8bf5c495-xkffm\" (UID: \"f1fefb43-64d1-496a-be4b-042d68027526\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.026259 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.038406 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.039267 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.041842 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-w5nwb" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.058044 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.065044 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkx5j\" (UniqueName: \"kubernetes.io/projected/1e6b09aa-e1b0-41c7-8aa0-e560de6310d5-kube-api-access-jkx5j\") pod \"keystone-operator-controller-manager-b4d948c87-4wg4t\" (UID: \"1e6b09aa-e1b0-41c7-8aa0-e560de6310d5\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.065093 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmnwx\" (UniqueName: \"kubernetes.io/projected/76de952b-76db-47de-8891-40006493cf30-kube-api-access-fmnwx\") pod \"manila-operator-controller-manager-67d996989d-zfhwz\" (UID: \"76de952b-76db-47de-8891-40006493cf30\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.065153 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8d6\" (UniqueName: \"kubernetes.io/projected/119c1266-bd43-49d6-a39f-93abbf47c2be-kube-api-access-vw8d6\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.065214 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlt4q\" (UniqueName: \"kubernetes.io/projected/672a2ef1-a6d0-41f6-9bbf-5d157863ee48-kube-api-access-wlt4q\") pod \"horizon-operator-controller-manager-5b9b8895d5-jjgd6\" (UID: \"672a2ef1-a6d0-41f6-9bbf-5d157863ee48\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:05 crc 
kubenswrapper[4725]: I0227 06:28:05.065323 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqk7\" (UniqueName: \"kubernetes.io/projected/28697286-96cb-46ad-a4a5-acc3716aba31-kube-api-access-rgqk7\") pod \"mariadb-operator-controller-manager-6994f66f48-x8wnp\" (UID: \"28697286-96cb-46ad-a4a5-acc3716aba31\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.065358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.065429 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7wcv\" (UniqueName: \"kubernetes.io/projected/9eeeac0e-6f80-4882-8d61-effa2342d69b-kube-api-access-x7wcv\") pod \"ironic-operator-controller-manager-554564d7fc-wc9zj\" (UID: \"9eeeac0e-6f80-4882-8d61-effa2342d69b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.065547 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.065611 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert podName:119c1266-bd43-49d6-a39f-93abbf47c2be nodeName:}" failed. No retries permitted until 2026-02-27 06:28:05.565594168 +0000 UTC m=+1064.028214737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert") pod "infra-operator-controller-manager-79d975b745-llwd8" (UID: "119c1266-bd43-49d6-a39f-93abbf47c2be") : secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.070577 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.071468 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.075691 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rmt6x" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.076119 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.081794 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.090712 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.091609 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.093105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8d6\" (UniqueName: \"kubernetes.io/projected/119c1266-bd43-49d6-a39f-93abbf47c2be-kube-api-access-vw8d6\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.093592 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7wcv\" (UniqueName: \"kubernetes.io/projected/9eeeac0e-6f80-4882-8d61-effa2342d69b-kube-api-access-x7wcv\") pod \"ironic-operator-controller-manager-554564d7fc-wc9zj\" (UID: \"9eeeac0e-6f80-4882-8d61-effa2342d69b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.094859 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9lbwr" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.108586 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlt4q\" (UniqueName: \"kubernetes.io/projected/672a2ef1-a6d0-41f6-9bbf-5d157863ee48-kube-api-access-wlt4q\") pod \"horizon-operator-controller-manager-5b9b8895d5-jjgd6\" (UID: \"672a2ef1-a6d0-41f6-9bbf-5d157863ee48\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.114322 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.120822 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.122866 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-drgm4" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.123540 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.130382 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.151010 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167811 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4b4b\" (UniqueName: \"kubernetes.io/projected/c32453ad-27be-4f95-bfc1-67878c36f13a-kube-api-access-x4b4b\") pod \"nova-operator-controller-manager-567668f5cf-pfq48\" (UID: \"c32453ad-27be-4f95-bfc1-67878c36f13a\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167848 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58h7\" (UniqueName: \"kubernetes.io/projected/9972ea1a-4a28-4b7f-b511-9dd8dd3e0599-kube-api-access-f58h7\") pod \"ovn-operator-controller-manager-5955d8c787-vx95x\" (UID: \"9972ea1a-4a28-4b7f-b511-9dd8dd3e0599\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167869 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgqk7\" (UniqueName: \"kubernetes.io/projected/28697286-96cb-46ad-a4a5-acc3716aba31-kube-api-access-rgqk7\") pod \"mariadb-operator-controller-manager-6994f66f48-x8wnp\" (UID: \"28697286-96cb-46ad-a4a5-acc3716aba31\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167918 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswv2\" (UniqueName: \"kubernetes.io/projected/c6de99a3-3c54-4192-8cf6-fab2c5c9750b-kube-api-access-sswv2\") pod \"octavia-operator-controller-manager-659dc6bbfc-6d6fj\" (UID: \"c6de99a3-3c54-4192-8cf6-fab2c5c9750b\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167948 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8z9f\" (UniqueName: \"kubernetes.io/projected/1ec01345-1480-48b1-9d36-9dd8a9fc2ef8-kube-api-access-g8z9f\") pod \"neutron-operator-controller-manager-6bd4687957-4g4xh\" (UID: \"1ec01345-1480-48b1-9d36-9dd8a9fc2ef8\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167967 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkx5j\" (UniqueName: \"kubernetes.io/projected/1e6b09aa-e1b0-41c7-8aa0-e560de6310d5-kube-api-access-jkx5j\") pod \"keystone-operator-controller-manager-b4d948c87-4wg4t\" (UID: \"1e6b09aa-e1b0-41c7-8aa0-e560de6310d5\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.167993 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmnwx\" (UniqueName: 
\"kubernetes.io/projected/76de952b-76db-47de-8891-40006493cf30-kube-api-access-fmnwx\") pod \"manila-operator-controller-manager-67d996989d-zfhwz\" (UID: \"76de952b-76db-47de-8891-40006493cf30\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.174008 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.175694 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.177974 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2wq7h" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.191403 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkx5j\" (UniqueName: \"kubernetes.io/projected/1e6b09aa-e1b0-41c7-8aa0-e560de6310d5-kube-api-access-jkx5j\") pod \"keystone-operator-controller-manager-b4d948c87-4wg4t\" (UID: \"1e6b09aa-e1b0-41c7-8aa0-e560de6310d5\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.192492 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmnwx\" (UniqueName: \"kubernetes.io/projected/76de952b-76db-47de-8891-40006493cf30-kube-api-access-fmnwx\") pod \"manila-operator-controller-manager-67d996989d-zfhwz\" (UID: \"76de952b-76db-47de-8891-40006493cf30\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.200954 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqk7\" (UniqueName: 
\"kubernetes.io/projected/28697286-96cb-46ad-a4a5-acc3716aba31-kube-api-access-rgqk7\") pod \"mariadb-operator-controller-manager-6994f66f48-x8wnp\" (UID: \"28697286-96cb-46ad-a4a5-acc3716aba31\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.204089 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.215363 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.232963 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270428 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270481 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f44z6\" (UniqueName: \"kubernetes.io/projected/2e53de05-a35e-4ca4-9776-1492c5030554-kube-api-access-f44z6\") pod \"placement-operator-controller-manager-8497b45c89-fgn4m\" (UID: \"2e53de05-a35e-4ca4-9776-1492c5030554\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270512 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x4b4b\" (UniqueName: \"kubernetes.io/projected/c32453ad-27be-4f95-bfc1-67878c36f13a-kube-api-access-x4b4b\") pod \"nova-operator-controller-manager-567668f5cf-pfq48\" (UID: \"c32453ad-27be-4f95-bfc1-67878c36f13a\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270529 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58h7\" (UniqueName: \"kubernetes.io/projected/9972ea1a-4a28-4b7f-b511-9dd8dd3e0599-kube-api-access-f58h7\") pod \"ovn-operator-controller-manager-5955d8c787-vx95x\" (UID: \"9972ea1a-4a28-4b7f-b511-9dd8dd3e0599\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270578 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sswv2\" (UniqueName: \"kubernetes.io/projected/c6de99a3-3c54-4192-8cf6-fab2c5c9750b-kube-api-access-sswv2\") pod \"octavia-operator-controller-manager-659dc6bbfc-6d6fj\" (UID: \"c6de99a3-3c54-4192-8cf6-fab2c5c9750b\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270610 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8z9f\" (UniqueName: \"kubernetes.io/projected/1ec01345-1480-48b1-9d36-9dd8a9fc2ef8-kube-api-access-g8z9f\") pod \"neutron-operator-controller-manager-6bd4687957-4g4xh\" (UID: \"1ec01345-1480-48b1-9d36-9dd8a9fc2ef8\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.270634 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q52s\" (UniqueName: \"kubernetes.io/projected/5fccc629-9a1d-4920-b3e7-817e49953fc1-kube-api-access-5q52s\") pod 
\"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.281088 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-m65cb"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.283736 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.287888 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-7z9l5" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.291798 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58h7\" (UniqueName: \"kubernetes.io/projected/9972ea1a-4a28-4b7f-b511-9dd8dd3e0599-kube-api-access-f58h7\") pod \"ovn-operator-controller-manager-5955d8c787-vx95x\" (UID: \"9972ea1a-4a28-4b7f-b511-9dd8dd3e0599\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.296515 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.297713 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-m65cb"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.301387 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswv2\" (UniqueName: \"kubernetes.io/projected/c6de99a3-3c54-4192-8cf6-fab2c5c9750b-kube-api-access-sswv2\") pod \"octavia-operator-controller-manager-659dc6bbfc-6d6fj\" (UID: \"c6de99a3-3c54-4192-8cf6-fab2c5c9750b\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.303604 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.304273 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4b4b\" (UniqueName: \"kubernetes.io/projected/c32453ad-27be-4f95-bfc1-67878c36f13a-kube-api-access-x4b4b\") pod \"nova-operator-controller-manager-567668f5cf-pfq48\" (UID: \"c32453ad-27be-4f95-bfc1-67878c36f13a\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.313553 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.309690 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8z9f\" (UniqueName: \"kubernetes.io/projected/1ec01345-1480-48b1-9d36-9dd8a9fc2ef8-kube-api-access-g8z9f\") pod \"neutron-operator-controller-manager-6bd4687957-4g4xh\" (UID: \"1ec01345-1480-48b1-9d36-9dd8a9fc2ef8\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.317876 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.318869 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.323360 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-z42n9" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.331442 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.351168 4725 generic.go:334] "Generic (PLEG): container finished" podID="c0efcde2-60c3-4d14-bec9-056e06640cc6" containerID="f389f3767c97b06e8758ffb74af43d9d796a2c4122053af652e1aed939e58b65" exitCode=0 Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.351263 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" event={"ID":"c0efcde2-60c3-4d14-bec9-056e06640cc6","Type":"ContainerDied","Data":"f389f3767c97b06e8758ffb74af43d9d796a2c4122053af652e1aed939e58b65"} Feb 27 06:28:05 crc kubenswrapper[4725]: 
I0227 06:28:05.357488 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.362630 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.377709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.379060 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.379142 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert podName:5fccc629-9a1d-4920-b3e7-817e49953fc1 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:05.879119728 +0000 UTC m=+1064.341740297 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" (UID: "5fccc629-9a1d-4920-b3e7-817e49953fc1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.379853 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f44z6\" (UniqueName: \"kubernetes.io/projected/2e53de05-a35e-4ca4-9776-1492c5030554-kube-api-access-f44z6\") pod \"placement-operator-controller-manager-8497b45c89-fgn4m\" (UID: \"2e53de05-a35e-4ca4-9776-1492c5030554\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.381876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q52s\" (UniqueName: \"kubernetes.io/projected/5fccc629-9a1d-4920-b3e7-817e49953fc1-kube-api-access-5q52s\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.381924 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4wd\" (UniqueName: \"kubernetes.io/projected/4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d-kube-api-access-vg4wd\") pod \"swift-operator-controller-manager-68f46476f-m65cb\" (UID: \"4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.400499 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f44z6\" (UniqueName: \"kubernetes.io/projected/2e53de05-a35e-4ca4-9776-1492c5030554-kube-api-access-f44z6\") 
pod \"placement-operator-controller-manager-8497b45c89-fgn4m\" (UID: \"2e53de05-a35e-4ca4-9776-1492c5030554\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.427182 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.427614 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q52s\" (UniqueName: \"kubernetes.io/projected/5fccc629-9a1d-4920-b3e7-817e49953fc1-kube-api-access-5q52s\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.427691 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.433206 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.437508 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.438014 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5tprv" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.461727 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.462563 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.463446 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.463915 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.466691 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5cksj" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.475953 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.476237 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.484642 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4wd\" (UniqueName: \"kubernetes.io/projected/4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d-kube-api-access-vg4wd\") pod \"swift-operator-controller-manager-68f46476f-m65cb\" (UID: \"4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.484687 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslgz\" (UniqueName: \"kubernetes.io/projected/a9acda6b-5c71-406c-985e-c5e026b064c8-kube-api-access-cslgz\") pod \"telemetry-operator-controller-manager-589c568786-2sqrk\" (UID: \"a9acda6b-5c71-406c-985e-c5e026b064c8\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.498970 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.504221 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.505073 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.507377 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sw9v8" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.508213 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.510420 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4wd\" (UniqueName: \"kubernetes.io/projected/4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d-kube-api-access-vg4wd\") pod \"swift-operator-controller-manager-68f46476f-m65cb\" (UID: \"4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.512842 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.517416 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.524993 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.526001 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.527958 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-qvj7s" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.553072 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586335 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586357 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8928w\" (UniqueName: \"kubernetes.io/projected/59d481fc-2689-420f-b779-c7d840fac75d-kube-api-access-8928w\") pod \"watcher-operator-controller-manager-6c68576fd-g8db5\" (UID: \"59d481fc-2689-420f-b779-c7d840fac75d\") " pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586406 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cslgz\" (UniqueName: \"kubernetes.io/projected/a9acda6b-5c71-406c-985e-c5e026b064c8-kube-api-access-cslgz\") pod \"telemetry-operator-controller-manager-589c568786-2sqrk\" (UID: \"a9acda6b-5c71-406c-985e-c5e026b064c8\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586450 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44znj\" (UniqueName: \"kubernetes.io/projected/31b25662-0274-4176-b3fd-4edd98517298-kube-api-access-44znj\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586473 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgfpv\" (UniqueName: \"kubernetes.io/projected/6f5e713b-cd6d-482f-8603-4dd47d2297d8-kube-api-access-xgfpv\") pod \"test-operator-controller-manager-5dc6794d5b-jp6zc\" (UID: \"6f5e713b-cd6d-482f-8603-4dd47d2297d8\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.586505 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.586624 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 
06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.586671 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert podName:119c1266-bd43-49d6-a39f-93abbf47c2be nodeName:}" failed. No retries permitted until 2026-02-27 06:28:06.586657644 +0000 UTC m=+1065.049278213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert") pod "infra-operator-controller-manager-79d975b745-llwd8" (UID: "119c1266-bd43-49d6-a39f-93abbf47c2be") : secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.605668 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.614558 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslgz\" (UniqueName: \"kubernetes.io/projected/a9acda6b-5c71-406c-985e-c5e026b064c8-kube-api-access-cslgz\") pod \"telemetry-operator-controller-manager-589c568786-2sqrk\" (UID: \"a9acda6b-5c71-406c-985e-c5e026b064c8\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.625253 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:05 crc kubenswrapper[4725]: W0227 06:28:05.642130 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90d813f_86f2_49c9_b7d2_66d44db8236c.slice/crio-1ff3f8bae94f1c0ef5de8f6b286674eea1de6e03c689afd606d3badf08a26cbd WatchSource:0}: Error finding container 1ff3f8bae94f1c0ef5de8f6b286674eea1de6e03c689afd606d3badf08a26cbd: Status 404 returned error can't find the container with id 1ff3f8bae94f1c0ef5de8f6b286674eea1de6e03c689afd606d3badf08a26cbd Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.659273 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.663546 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.688653 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.688911 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.688897 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.688699 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.688994 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:06.188972895 +0000 UTC m=+1064.651593464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.689021 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8928w\" (UniqueName: \"kubernetes.io/projected/59d481fc-2689-420f-b779-c7d840fac75d-kube-api-access-8928w\") pod \"watcher-operator-controller-manager-6c68576fd-g8db5\" (UID: \"59d481fc-2689-420f-b779-c7d840fac75d\") " pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.689034 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:06.189010526 +0000 UTC m=+1064.651631095 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "metrics-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.689194 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44znj\" (UniqueName: \"kubernetes.io/projected/31b25662-0274-4176-b3fd-4edd98517298-kube-api-access-44znj\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.689240 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgfpv\" (UniqueName: \"kubernetes.io/projected/6f5e713b-cd6d-482f-8603-4dd47d2297d8-kube-api-access-xgfpv\") pod \"test-operator-controller-manager-5dc6794d5b-jp6zc\" (UID: \"6f5e713b-cd6d-482f-8603-4dd47d2297d8\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.689349 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssfts\" (UniqueName: \"kubernetes.io/projected/96664d14-2465-472a-b6c6-5589153d5ee3-kube-api-access-ssfts\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zb72l\" (UID: \"96664d14-2465-472a-b6c6-5589153d5ee3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.708785 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8928w\" (UniqueName: \"kubernetes.io/projected/59d481fc-2689-420f-b779-c7d840fac75d-kube-api-access-8928w\") pod 
\"watcher-operator-controller-manager-6c68576fd-g8db5\" (UID: \"59d481fc-2689-420f-b779-c7d840fac75d\") " pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.716656 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgfpv\" (UniqueName: \"kubernetes.io/projected/6f5e713b-cd6d-482f-8603-4dd47d2297d8-kube-api-access-xgfpv\") pod \"test-operator-controller-manager-5dc6794d5b-jp6zc\" (UID: \"6f5e713b-cd6d-482f-8603-4dd47d2297d8\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.716746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44znj\" (UniqueName: \"kubernetes.io/projected/31b25662-0274-4176-b3fd-4edd98517298-kube-api-access-44znj\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.782142 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.790024 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssfts\" (UniqueName: \"kubernetes.io/projected/96664d14-2465-472a-b6c6-5589153d5ee3-kube-api-access-ssfts\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zb72l\" (UID: \"96664d14-2465-472a-b6c6-5589153d5ee3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.804937 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.815514 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssfts\" (UniqueName: \"kubernetes.io/projected/96664d14-2465-472a-b6c6-5589153d5ee3-kube-api-access-ssfts\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zb72l\" (UID: \"96664d14-2465-472a-b6c6-5589153d5ee3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.875984 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.891140 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.891350 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: E0227 06:28:05.891428 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert podName:5fccc629-9a1d-4920-b3e7-817e49953fc1 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:06.891410297 +0000 UTC m=+1065.354030866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" (UID: "5fccc629-9a1d-4920-b3e7-817e49953fc1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.894866 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj"] Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.924099 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn"] Feb 27 06:28:05 crc kubenswrapper[4725]: W0227 06:28:05.935716 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod672a2ef1_a6d0_41f6_9bbf_5d157863ee48.slice/crio-45d3ddade1aecbab391e954387431ed7d15919390af4c3241f62ff8e6bbec268 WatchSource:0}: Error finding container 45d3ddade1aecbab391e954387431ed7d15919390af4c3241f62ff8e6bbec268: Status 404 returned error can't find the container with id 45d3ddade1aecbab391e954387431ed7d15919390af4c3241f62ff8e6bbec268 Feb 27 06:28:05 crc kubenswrapper[4725]: I0227 06:28:05.937054 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.119677 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.137525 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.143302 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh"] Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.145658 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28697286_96cb_46ad_a4a5_acc3716aba31.slice/crio-919eb28b3102be877ac8bcd923a20997c1a8b08b5419fab0df02441e05c98c0a WatchSource:0}: Error finding container 919eb28b3102be877ac8bcd923a20997c1a8b08b5419fab0df02441e05c98c0a: Status 404 returned error can't find the container with id 919eb28b3102be877ac8bcd923a20997c1a8b08b5419fab0df02441e05c98c0a Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.149940 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec01345_1480_48b1_9d36_9dd8a9fc2ef8.slice/crio-4e8fb44652ebe6c32af1012a07f23db0e6a4bed9203c476cdce2709729a8d91a WatchSource:0}: Error finding container 4e8fb44652ebe6c32af1012a07f23db0e6a4bed9203c476cdce2709729a8d91a: Status 404 returned error can't find the container with id 4e8fb44652ebe6c32af1012a07f23db0e6a4bed9203c476cdce2709729a8d91a Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.167764 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t"] Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.173738 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6b09aa_e1b0_41c7_8aa0_e560de6310d5.slice/crio-a78a779b6e34b20a945443e49b77a92769b4dd7cb7b11bd60da9b32956453fd9 WatchSource:0}: Error finding container a78a779b6e34b20a945443e49b77a92769b4dd7cb7b11bd60da9b32956453fd9: Status 404 returned error can't find the container with id a78a779b6e34b20a945443e49b77a92769b4dd7cb7b11bd60da9b32956453fd9 Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.202063 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.202109 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.202274 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.202343 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:07.202327394 +0000 UTC m=+1065.664947963 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "webhook-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.202762 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.202793 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:07.202785977 +0000 UTC m=+1065.665406546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "metrics-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.267465 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2741416a-3f8b-4644-8d09-127648003383" path="/var/lib/kubelet/pods/2741416a-3f8b-4644-8d09-127648003383/volumes" Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.324858 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm"] Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.335446 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1fefb43_64d1_496a_be4b_042d68027526.slice/crio-e90ec693f4ef1a78c00a106d69d0beff92b9e639e295b90f04e13d3f7cb4ebc0 WatchSource:0}: Error finding container 
e90ec693f4ef1a78c00a106d69d0beff92b9e639e295b90f04e13d3f7cb4ebc0: Status 404 returned error can't find the container with id e90ec693f4ef1a78c00a106d69d0beff92b9e639e295b90f04e13d3f7cb4ebc0 Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.339426 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.348323 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48"] Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.352009 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32453ad_27be_4f95_bfc1_67878c36f13a.slice/crio-3cc009816e409b7f54c9fca39b8d997d5010bf5c97df2e7af70b53399b04e01a WatchSource:0}: Error finding container 3cc009816e409b7f54c9fca39b8d997d5010bf5c97df2e7af70b53399b04e01a: Status 404 returned error can't find the container with id 3cc009816e409b7f54c9fca39b8d997d5010bf5c97df2e7af70b53399b04e01a Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.356721 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.405271 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" event={"ID":"6e80b5f0-45bb-4081-808e-800527949f7e","Type":"ContainerStarted","Data":"a2370c85361e44aa2de65cc4a6c4762ce5c74641d1f216b62dd0fdbced1b93de"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.427229 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" 
event={"ID":"55b7330d-fa67-491c-9354-3ae2f377b245","Type":"ContainerStarted","Data":"c671fb55c746c9bd02a47ca20f7fe52646590c1f06eef3431188e11af0cac460"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.480791 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" event={"ID":"672a2ef1-a6d0-41f6-9bbf-5d157863ee48","Type":"ContainerStarted","Data":"45d3ddade1aecbab391e954387431ed7d15919390af4c3241f62ff8e6bbec268"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.512629 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" event={"ID":"28697286-96cb-46ad-a4a5-acc3716aba31","Type":"ContainerStarted","Data":"919eb28b3102be877ac8bcd923a20997c1a8b08b5419fab0df02441e05c98c0a"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.537195 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.542455 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-m65cb"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.545971 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.552024 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.556800 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk"] Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.563634 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" event={"ID":"246fa0fd-dd91-4c17-9754-8ed71768660a","Type":"ContainerStarted","Data":"18879e6d8f04efe0f083b3ded7757c73ebc4f5dd7577dab2455bc6a59a7f7937"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.571945 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5"] Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.578433 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sswv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-6d6fj_openstack-operators(c6de99a3-3c54-4192-8cf6-fab2c5c9750b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.579609 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" podUID="c6de99a3-3c54-4192-8cf6-fab2c5c9750b" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.583198 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vg4wd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-m65cb_openstack-operators(4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.586329 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" event={"ID":"f1fefb43-64d1-496a-be4b-042d68027526","Type":"ContainerStarted","Data":"e90ec693f4ef1a78c00a106d69d0beff92b9e639e295b90f04e13d3f7cb4ebc0"} Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.586340 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" podUID="4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d" Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.591473 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" 
event={"ID":"a90d813f-86f2-49c9-b7d2-66d44db8236c","Type":"ContainerStarted","Data":"1ff3f8bae94f1c0ef5de8f6b286674eea1de6e03c689afd606d3badf08a26cbd"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.592588 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" event={"ID":"1ec01345-1480-48b1-9d36-9dd8a9fc2ef8","Type":"ContainerStarted","Data":"4e8fb44652ebe6c32af1012a07f23db0e6a4bed9203c476cdce2709729a8d91a"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.593548 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" event={"ID":"9eeeac0e-6f80-4882-8d61-effa2342d69b","Type":"ContainerStarted","Data":"fb5f8b0b020405ab796242218873dedf9fc6403619198833f67ee3e16c203c52"} Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.594644 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" event={"ID":"1e6b09aa-e1b0-41c7-8aa0-e560de6310d5","Type":"ContainerStarted","Data":"a78a779b6e34b20a945443e49b77a92769b4dd7cb7b11bd60da9b32956453fd9"} Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.603116 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9acda6b_5c71_406c_985e_c5e026b064c8.slice/crio-248f1ba07bb3c52f3493b021cb44713aa3d97ab3e6d558f5be04ce3e735ec007 WatchSource:0}: Error finding container 248f1ba07bb3c52f3493b021cb44713aa3d97ab3e6d558f5be04ce3e735ec007: Status 404 returned error can't find the container with id 248f1ba07bb3c52f3493b021cb44713aa3d97ab3e6d558f5be04ce3e735ec007 Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.610113 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod 
\"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.610363 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.610418 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert podName:119c1266-bd43-49d6-a39f-93abbf47c2be nodeName:}" failed. No retries permitted until 2026-02-27 06:28:08.610399787 +0000 UTC m=+1067.073020356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert") pod "infra-operator-controller-manager-79d975b745-llwd8" (UID: "119c1266-bd43-49d6-a39f-93abbf47c2be") : secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: W0227 06:28:06.616814 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5e713b_cd6d_482f_8603_4dd47d2297d8.slice/crio-075bf56e49d21bae55c4c5da17a83bbdae5a0d2f154050b802f39cbc51409335 WatchSource:0}: Error finding container 075bf56e49d21bae55c4c5da17a83bbdae5a0d2f154050b802f39cbc51409335: Status 404 returned error can't find the container with id 075bf56e49d21bae55c4c5da17a83bbdae5a0d2f154050b802f39cbc51409335 Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.617815 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cslgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-589c568786-2sqrk_openstack-operators(a9acda6b-5c71-406c-985e-c5e026b064c8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.618241 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xgfpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-jp6zc_openstack-operators(6f5e713b-cd6d-482f-8603-4dd47d2297d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.618276 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l"] Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.619430 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" podUID="a9acda6b-5c71-406c-985e-c5e026b064c8" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.619470 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" podUID="6f5e713b-cd6d-482f-8603-4dd47d2297d8" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.648654 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.203:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8928w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c68576fd-g8db5_openstack-operators(59d481fc-2689-420f-b779-c7d840fac75d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.649482 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ssfts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-zb72l_openstack-operators(96664d14-2465-472a-b6c6-5589153d5ee3): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.650487 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" podUID="59d481fc-2689-420f-b779-c7d840fac75d" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.650614 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" podUID="96664d14-2465-472a-b6c6-5589153d5ee3" Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.914470 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.914859 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: E0227 06:28:06.914929 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert podName:5fccc629-9a1d-4920-b3e7-817e49953fc1 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:08.914906913 +0000 UTC m=+1067.377527482 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" (UID: "5fccc629-9a1d-4920-b3e7-817e49953fc1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:06 crc kubenswrapper[4725]: I0227 06:28:06.918847 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.016122 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhgv8\" (UniqueName: \"kubernetes.io/projected/c0efcde2-60c3-4d14-bec9-056e06640cc6-kube-api-access-bhgv8\") pod \"c0efcde2-60c3-4d14-bec9-056e06640cc6\" (UID: \"c0efcde2-60c3-4d14-bec9-056e06640cc6\") " Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.020773 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0efcde2-60c3-4d14-bec9-056e06640cc6-kube-api-access-bhgv8" (OuterVolumeSpecName: "kube-api-access-bhgv8") pod "c0efcde2-60c3-4d14-bec9-056e06640cc6" (UID: "c0efcde2-60c3-4d14-bec9-056e06640cc6"). InnerVolumeSpecName "kube-api-access-bhgv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.117745 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhgv8\" (UniqueName: \"kubernetes.io/projected/c0efcde2-60c3-4d14-bec9-056e06640cc6-kube-api-access-bhgv8\") on node \"crc\" DevicePath \"\"" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.219356 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.219418 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.219544 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.219596 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.219620 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:09.219602674 +0000 UTC m=+1067.682223243 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "metrics-server-cert" not found Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.219648 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:09.219633125 +0000 UTC m=+1067.682253684 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "webhook-server-cert" not found Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.388715 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536222-m8lnm"] Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.393510 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536222-m8lnm"] Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.625888 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" event={"ID":"c6de99a3-3c54-4192-8cf6-fab2c5c9750b","Type":"ContainerStarted","Data":"251195f2b662dd02c2b7af90528492365f92df6e2a1f040ef8173d7a78c0ea32"} Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.627771 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" podUID="c6de99a3-3c54-4192-8cf6-fab2c5c9750b" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.631067 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" event={"ID":"96664d14-2465-472a-b6c6-5589153d5ee3","Type":"ContainerStarted","Data":"dacdebe1a3e734a5c760ba0211a06e3652721325fcf4159ce82409355f8c073e"} Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.636097 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" podUID="96664d14-2465-472a-b6c6-5589153d5ee3" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.657564 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" event={"ID":"9972ea1a-4a28-4b7f-b511-9dd8dd3e0599","Type":"ContainerStarted","Data":"2fb425a038bb0106690931adad12a1a2f382f1a8f943fcb309026a71fedbd1dc"} Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.663665 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" event={"ID":"a9acda6b-5c71-406c-985e-c5e026b064c8","Type":"ContainerStarted","Data":"248f1ba07bb3c52f3493b021cb44713aa3d97ab3e6d558f5be04ce3e735ec007"} Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.666948 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" podUID="a9acda6b-5c71-406c-985e-c5e026b064c8" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.667600 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" event={"ID":"6f5e713b-cd6d-482f-8603-4dd47d2297d8","Type":"ContainerStarted","Data":"075bf56e49d21bae55c4c5da17a83bbdae5a0d2f154050b802f39cbc51409335"} Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.669188 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" podUID="6f5e713b-cd6d-482f-8603-4dd47d2297d8" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.669756 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" event={"ID":"76de952b-76db-47de-8891-40006493cf30","Type":"ContainerStarted","Data":"28fc1d563d2092529ad15e0e2592f33015619f1b0d9c62fa5142beaa7563536e"} Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.680136 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" event={"ID":"59d481fc-2689-420f-b779-c7d840fac75d","Type":"ContainerStarted","Data":"bf735a7a64a95bffb6005937bbd2ab06b8c236af9561b6888ef99cc09cb65ac4"} Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.681912 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.203:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" podUID="59d481fc-2689-420f-b779-c7d840fac75d" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.682153 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" event={"ID":"c32453ad-27be-4f95-bfc1-67878c36f13a","Type":"ContainerStarted","Data":"3cc009816e409b7f54c9fca39b8d997d5010bf5c97df2e7af70b53399b04e01a"} Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.683258 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" event={"ID":"4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d","Type":"ContainerStarted","Data":"64f076a45d64e50e48696db9ab06967018cd30a13d7d283db9f0dcb888d0db2e"} Feb 27 06:28:07 crc kubenswrapper[4725]: E0227 06:28:07.685068 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" podUID="4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.685633 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" event={"ID":"c0efcde2-60c3-4d14-bec9-056e06640cc6","Type":"ContainerDied","Data":"a181bccb818a970d80a43c0778a128cd21b5b339de36d5d00416da51441d7721"} Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.685656 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a181bccb818a970d80a43c0778a128cd21b5b339de36d5d00416da51441d7721" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.685700 
4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536228-jrqtv" Feb 27 06:28:07 crc kubenswrapper[4725]: I0227 06:28:07.692865 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" event={"ID":"2e53de05-a35e-4ca4-9776-1492c5030554","Type":"ContainerStarted","Data":"2328da9f3e69c01b26631680e24e432ec44ab590023df601c5f9b0d86cb259d5"} Feb 27 06:28:08 crc kubenswrapper[4725]: I0227 06:28:08.265249 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5f726c-3d3b-4b48-9ce9-4ce76e329edf" path="/var/lib/kubelet/pods/be5f726c-3d3b-4b48-9ce9-4ce76e329edf/volumes" Feb 27 06:28:08 crc kubenswrapper[4725]: I0227 06:28:08.646650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.646836 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.646923 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert podName:119c1266-bd43-49d6-a39f-93abbf47c2be nodeName:}" failed. No retries permitted until 2026-02-27 06:28:12.646906144 +0000 UTC m=+1071.109526713 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert") pod "infra-operator-controller-manager-79d975b745-llwd8" (UID: "119c1266-bd43-49d6-a39f-93abbf47c2be") : secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.704135 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" podUID="a9acda6b-5c71-406c-985e-c5e026b064c8" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.704511 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/openstack-k8s-operators/watcher-operator:eaf82eeed7c641cca4b0e467ff9bfd7468ff8986\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" podUID="59d481fc-2689-420f-b779-c7d840fac75d" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.704644 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" podUID="96664d14-2465-472a-b6c6-5589153d5ee3" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.704690 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" podUID="4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.704903 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" podUID="c6de99a3-3c54-4192-8cf6-fab2c5c9750b" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.704940 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" podUID="6f5e713b-cd6d-482f-8603-4dd47d2297d8" Feb 27 06:28:08 crc kubenswrapper[4725]: I0227 06:28:08.956851 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.957021 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:08 crc kubenswrapper[4725]: E0227 06:28:08.957101 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert podName:5fccc629-9a1d-4920-b3e7-817e49953fc1 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:12.95707905 +0000 UTC m=+1071.419699619 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" (UID: "5fccc629-9a1d-4920-b3e7-817e49953fc1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:09 crc kubenswrapper[4725]: I0227 06:28:09.261184 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:09 crc kubenswrapper[4725]: I0227 06:28:09.261228 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:09 crc kubenswrapper[4725]: E0227 06:28:09.261370 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 06:28:09 crc kubenswrapper[4725]: E0227 06:28:09.261397 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 06:28:09 crc kubenswrapper[4725]: E0227 06:28:09.261444 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:13.261424782 +0000 UTC m=+1071.724045361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "metrics-server-cert" not found Feb 27 06:28:09 crc kubenswrapper[4725]: E0227 06:28:09.261463 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:13.261453883 +0000 UTC m=+1071.724074462 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "webhook-server-cert" not found Feb 27 06:28:12 crc kubenswrapper[4725]: I0227 06:28:12.719795 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:12 crc kubenswrapper[4725]: E0227 06:28:12.719968 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:12 crc kubenswrapper[4725]: E0227 06:28:12.720382 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert podName:119c1266-bd43-49d6-a39f-93abbf47c2be nodeName:}" failed. No retries permitted until 2026-02-27 06:28:20.720363182 +0000 UTC m=+1079.182983751 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert") pod "infra-operator-controller-manager-79d975b745-llwd8" (UID: "119c1266-bd43-49d6-a39f-93abbf47c2be") : secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:13 crc kubenswrapper[4725]: I0227 06:28:13.024343 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:13 crc kubenswrapper[4725]: E0227 06:28:13.024564 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:13 crc kubenswrapper[4725]: E0227 06:28:13.024814 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert podName:5fccc629-9a1d-4920-b3e7-817e49953fc1 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:21.024797397 +0000 UTC m=+1079.487417966 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" (UID: "5fccc629-9a1d-4920-b3e7-817e49953fc1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:13 crc kubenswrapper[4725]: I0227 06:28:13.333415 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:13 crc kubenswrapper[4725]: I0227 06:28:13.333519 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:13 crc kubenswrapper[4725]: E0227 06:28:13.333698 4725 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 06:28:13 crc kubenswrapper[4725]: E0227 06:28:13.333742 4725 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 06:28:13 crc kubenswrapper[4725]: E0227 06:28:13.333821 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:21.33379117 +0000 UTC m=+1079.796411779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "metrics-server-cert" not found Feb 27 06:28:13 crc kubenswrapper[4725]: E0227 06:28:13.333862 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs podName:31b25662-0274-4176-b3fd-4edd98517298 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:21.333844691 +0000 UTC m=+1079.796465300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs") pod "openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" (UID: "31b25662-0274-4176-b3fd-4edd98517298") : secret "webhook-server-cert" not found Feb 27 06:28:18 crc kubenswrapper[4725]: E0227 06:28:18.486148 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 27 06:28:18 crc kubenswrapper[4725]: E0227 06:28:18.486965 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x7wcv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-wc9zj_openstack-operators(9eeeac0e-6f80-4882-8d61-effa2342d69b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:28:18 crc kubenswrapper[4725]: E0227 06:28:18.488358 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" podUID="9eeeac0e-6f80-4882-8d61-effa2342d69b" Feb 27 06:28:18 crc kubenswrapper[4725]: E0227 06:28:18.818441 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" podUID="9eeeac0e-6f80-4882-8d61-effa2342d69b" Feb 27 06:28:20 crc kubenswrapper[4725]: I0227 06:28:20.775659 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:20 crc kubenswrapper[4725]: E0227 06:28:20.776116 4725 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:20 crc kubenswrapper[4725]: E0227 06:28:20.776174 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert podName:119c1266-bd43-49d6-a39f-93abbf47c2be nodeName:}" failed. No retries permitted until 2026-02-27 06:28:36.776156261 +0000 UTC m=+1095.238776840 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert") pod "infra-operator-controller-manager-79d975b745-llwd8" (UID: "119c1266-bd43-49d6-a39f-93abbf47c2be") : secret "infra-operator-webhook-server-cert" not found Feb 27 06:28:21 crc kubenswrapper[4725]: I0227 06:28:21.081100 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:21 crc kubenswrapper[4725]: E0227 06:28:21.081272 4725 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:21 crc kubenswrapper[4725]: E0227 06:28:21.081400 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert podName:5fccc629-9a1d-4920-b3e7-817e49953fc1 nodeName:}" failed. No retries permitted until 2026-02-27 06:28:37.081379087 +0000 UTC m=+1095.543999666 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" (UID: "5fccc629-9a1d-4920-b3e7-817e49953fc1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 06:28:21 crc kubenswrapper[4725]: I0227 06:28:21.385835 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:21 crc kubenswrapper[4725]: I0227 06:28:21.386309 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:21 crc kubenswrapper[4725]: I0227 06:28:21.396399 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-webhook-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:21 crc kubenswrapper[4725]: I0227 06:28:21.399317 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31b25662-0274-4176-b3fd-4edd98517298-metrics-certs\") pod \"openstack-operator-controller-manager-5cb5b7b9c5-kwj9k\" (UID: \"31b25662-0274-4176-b3fd-4edd98517298\") " pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:21 crc kubenswrapper[4725]: I0227 06:28:21.440207 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:22 crc kubenswrapper[4725]: E0227 06:28:22.638335 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:8f06b9963e5b324856ce8ed80872cf04fdfb299d4f5cf13cb1d26f4e69ed42be" Feb 27 06:28:22 crc kubenswrapper[4725]: E0227 06:28:22.638833 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:8f06b9963e5b324856ce8ed80872cf04fdfb299d4f5cf13cb1d26f4e69ed42be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fffs8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-784b5bb6c5-48l7p_openstack-operators(55b7330d-fa67-491c-9354-3ae2f377b245): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:28:22 crc kubenswrapper[4725]: E0227 06:28:22.642658 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" podUID="55b7330d-fa67-491c-9354-3ae2f377b245" Feb 27 06:28:22 crc kubenswrapper[4725]: E0227 06:28:22.861663 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:8f06b9963e5b324856ce8ed80872cf04fdfb299d4f5cf13cb1d26f4e69ed42be\\\"\"" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" podUID="55b7330d-fa67-491c-9354-3ae2f377b245" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.184544 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.184752 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmnwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-zfhwz_openstack-operators(76de952b-76db-47de-8891-40006493cf30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.185960 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" podUID="76de952b-76db-47de-8891-40006493cf30" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.870563 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" podUID="76de952b-76db-47de-8891-40006493cf30" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.875820 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.876064 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8z9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-4g4xh_openstack-operators(1ec01345-1480-48b1-9d36-9dd8a9fc2ef8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:28:23 crc kubenswrapper[4725]: E0227 06:28:23.877776 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" podUID="1ec01345-1480-48b1-9d36-9dd8a9fc2ef8" Feb 27 06:28:24 crc kubenswrapper[4725]: E0227 06:28:24.696738 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 27 06:28:24 crc kubenswrapper[4725]: E0227 06:28:24.696879 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x4b4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-pfq48_openstack-operators(c32453ad-27be-4f95-bfc1-67878c36f13a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:28:24 crc kubenswrapper[4725]: E0227 06:28:24.698683 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" podUID="c32453ad-27be-4f95-bfc1-67878c36f13a" Feb 27 06:28:24 crc kubenswrapper[4725]: E0227 06:28:24.873593 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" podUID="c32453ad-27be-4f95-bfc1-67878c36f13a" Feb 27 06:28:24 crc kubenswrapper[4725]: E0227 06:28:24.873686 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" podUID="1ec01345-1480-48b1-9d36-9dd8a9fc2ef8" Feb 27 06:28:25 crc kubenswrapper[4725]: E0227 06:28:25.352431 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 27 06:28:25 crc kubenswrapper[4725]: E0227 06:28:25.352643 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jkx5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-4wg4t_openstack-operators(1e6b09aa-e1b0-41c7-8aa0-e560de6310d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:28:25 crc kubenswrapper[4725]: E0227 06:28:25.353822 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" podUID="1e6b09aa-e1b0-41c7-8aa0-e560de6310d5" Feb 27 06:28:25 crc kubenswrapper[4725]: E0227 06:28:25.882588 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" podUID="1e6b09aa-e1b0-41c7-8aa0-e560de6310d5" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.041364 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k"] Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.950027 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" event={"ID":"28697286-96cb-46ad-a4a5-acc3716aba31","Type":"ContainerStarted","Data":"d07a8848008a06fa6e7fe25ebbab1c50aa788f5fd04476a264b98e12fa542f5e"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.950590 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.962632 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" event={"ID":"31b25662-0274-4176-b3fd-4edd98517298","Type":"ContainerStarted","Data":"a8be77ee9a3c0032aab4c10ecb0e4a5552b5d672ee56c37b69bd1fe13c68c26c"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.962690 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" 
event={"ID":"31b25662-0274-4176-b3fd-4edd98517298","Type":"ContainerStarted","Data":"1c108d2b7f2134f13dd5b119ba48eee2246960ae0fc54d84a9a2228e0ea24611"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.962733 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.969571 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" event={"ID":"9eeeac0e-6f80-4882-8d61-effa2342d69b","Type":"ContainerStarted","Data":"4a7c518bf57638350f390c361112863ba88962d5f65179d13961a53624d5c450"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.969803 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.972262 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" event={"ID":"59d481fc-2689-420f-b779-c7d840fac75d","Type":"ContainerStarted","Data":"ee33c683562ddc0422d500b3a560d2223f56d2d80c89f2d4707d4beb7bc38973"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.972781 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.974389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" event={"ID":"c6de99a3-3c54-4192-8cf6-fab2c5c9750b","Type":"ContainerStarted","Data":"1f4f2f105c3534dfe2cfc0a0754701a00f07d6d0870a01459678d81a8098c450"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.974760 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.976155 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" event={"ID":"2e53de05-a35e-4ca4-9776-1492c5030554","Type":"ContainerStarted","Data":"f4a0db527925397552657d9cd276643071f8e2ef198440030c26f3709bbb291e"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.976497 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.977853 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" event={"ID":"4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d","Type":"ContainerStarted","Data":"a4ff0be218dbfd99441d6cefc0498febe6928eb7b32aa00bb7a5206e4d08fb62"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.978187 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.979172 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" event={"ID":"6e80b5f0-45bb-4081-808e-800527949f7e","Type":"ContainerStarted","Data":"feb8f0fd18d1ddd1c2f0f7055f51aa9a29e0fee000f0a08ad815fcc55eb74fcf"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.979506 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.980902 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" 
event={"ID":"672a2ef1-a6d0-41f6-9bbf-5d157863ee48","Type":"ContainerStarted","Data":"8ad011d6d203e7121a2b9022766951bee832426a622edb1d54cff97e135d8ef1"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.981223 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.982477 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" event={"ID":"246fa0fd-dd91-4c17-9754-8ed71768660a","Type":"ContainerStarted","Data":"9dc6433b9d448ac802d85c745c3ada22606fa706aae1b333718e542e84e82f37"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.982797 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.984437 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" event={"ID":"9972ea1a-4a28-4b7f-b511-9dd8dd3e0599","Type":"ContainerStarted","Data":"1caa864d101967d05b4973dbeeb7bed32f0be21b58faddb29184db4fc166ec5e"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.984777 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.985905 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" event={"ID":"a9acda6b-5c71-406c-985e-c5e026b064c8","Type":"ContainerStarted","Data":"0c13a25213f613e9205436ab6157184a4b9e4984cae83e321cb67bc0f2b0fcaf"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.986229 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.987986 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" event={"ID":"f1fefb43-64d1-496a-be4b-042d68027526","Type":"ContainerStarted","Data":"90cfecf2a8d65b2cd3b4bebacc48ee63b31e90a21204c228cbcfa1fe6ff14dd6"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.988187 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.989600 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" event={"ID":"6f5e713b-cd6d-482f-8603-4dd47d2297d8","Type":"ContainerStarted","Data":"c7efe470ff9a8825eb751f4c2efb7f3de4208eae5231e12ce613e81162a68390"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.989933 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.991148 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" event={"ID":"a90d813f-86f2-49c9-b7d2-66d44db8236c","Type":"ContainerStarted","Data":"c3bac57efbb158387b7ec6c4565b216b5f6ab5dbae7c62b04ff7bbc95fa89490"} Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.991498 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:31 crc kubenswrapper[4725]: I0227 06:28:31.992668 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" 
event={"ID":"96664d14-2465-472a-b6c6-5589153d5ee3","Type":"ContainerStarted","Data":"04ee5400e63f88c60ca4a917bf223b770c7921649498af372118dafb9b879c90"} Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.094150 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" podStartSLOduration=7.736877382 podStartE2EDuration="28.094123496s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.147794458 +0000 UTC m=+1064.610415027" lastFinishedPulling="2026-02-27 06:28:26.505040542 +0000 UTC m=+1084.967661141" observedRunningTime="2026-02-27 06:28:32.090987768 +0000 UTC m=+1090.553608337" watchObservedRunningTime="2026-02-27 06:28:32.094123496 +0000 UTC m=+1090.556744065" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.286328 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" podStartSLOduration=8.650423602 podStartE2EDuration="28.286310109s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:05.700737047 +0000 UTC m=+1064.163357616" lastFinishedPulling="2026-02-27 06:28:25.336623554 +0000 UTC m=+1083.799244123" observedRunningTime="2026-02-27 06:28:32.282930124 +0000 UTC m=+1090.745550693" watchObservedRunningTime="2026-02-27 06:28:32.286310109 +0000 UTC m=+1090.748930678" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.394136 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" podStartSLOduration=27.394114025 podStartE2EDuration="27.394114025s" podCreationTimestamp="2026-02-27 06:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:28:32.387686444 +0000 UTC m=+1090.850307023" 
watchObservedRunningTime="2026-02-27 06:28:32.394114025 +0000 UTC m=+1090.856734594" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.396475 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" podStartSLOduration=3.430916315 podStartE2EDuration="28.396465241s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:05.951079527 +0000 UTC m=+1064.413700096" lastFinishedPulling="2026-02-27 06:28:30.916628453 +0000 UTC m=+1089.379249022" observedRunningTime="2026-02-27 06:28:32.344098707 +0000 UTC m=+1090.806719276" watchObservedRunningTime="2026-02-27 06:28:32.396465241 +0000 UTC m=+1090.859085810" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.489114 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" podStartSLOduration=4.232867563 podStartE2EDuration="28.48909364s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.583042967 +0000 UTC m=+1065.045663536" lastFinishedPulling="2026-02-27 06:28:30.839269034 +0000 UTC m=+1089.301889613" observedRunningTime="2026-02-27 06:28:32.427209787 +0000 UTC m=+1090.889830356" watchObservedRunningTime="2026-02-27 06:28:32.48909364 +0000 UTC m=+1090.951714209" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.537050 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" podStartSLOduration=3.26224483 podStartE2EDuration="27.53702826s" podCreationTimestamp="2026-02-27 06:28:05 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.648457849 +0000 UTC m=+1065.111078418" lastFinishedPulling="2026-02-27 06:28:30.923241269 +0000 UTC m=+1089.385861848" observedRunningTime="2026-02-27 06:28:32.489975715 +0000 UTC m=+1090.952596304" 
watchObservedRunningTime="2026-02-27 06:28:32.53702826 +0000 UTC m=+1090.999648829" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.567176 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" podStartSLOduration=9.170855459 podStartE2EDuration="28.567159279s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:05.940389886 +0000 UTC m=+1064.403010465" lastFinishedPulling="2026-02-27 06:28:25.336693726 +0000 UTC m=+1083.799314285" observedRunningTime="2026-02-27 06:28:32.54623791 +0000 UTC m=+1091.008858479" watchObservedRunningTime="2026-02-27 06:28:32.567159279 +0000 UTC m=+1091.029779848" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.597072 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" podStartSLOduration=3.658622444 podStartE2EDuration="27.597056051s" podCreationTimestamp="2026-02-27 06:28:05 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.618139795 +0000 UTC m=+1065.080760354" lastFinishedPulling="2026-02-27 06:28:30.556573362 +0000 UTC m=+1089.019193961" observedRunningTime="2026-02-27 06:28:32.568654711 +0000 UTC m=+1091.031275280" watchObservedRunningTime="2026-02-27 06:28:32.597056051 +0000 UTC m=+1091.059676620" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.598578 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zb72l" podStartSLOduration=3.374387739 podStartE2EDuration="27.598573414s" podCreationTimestamp="2026-02-27 06:28:05 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.649329924 +0000 UTC m=+1065.111950493" lastFinishedPulling="2026-02-27 06:28:30.873515569 +0000 UTC m=+1089.336136168" observedRunningTime="2026-02-27 06:28:32.595375524 +0000 UTC m=+1091.057996093" 
watchObservedRunningTime="2026-02-27 06:28:32.598573414 +0000 UTC m=+1091.061193983" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.613085 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" podStartSLOduration=8.052028168 podStartE2EDuration="28.613066782s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:05.943809193 +0000 UTC m=+1064.406429762" lastFinishedPulling="2026-02-27 06:28:26.504847807 +0000 UTC m=+1084.967468376" observedRunningTime="2026-02-27 06:28:32.609628975 +0000 UTC m=+1091.072249544" watchObservedRunningTime="2026-02-27 06:28:32.613066782 +0000 UTC m=+1091.075687351" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.633044 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" podStartSLOduration=4.654722414 podStartE2EDuration="28.633025884s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.578277822 +0000 UTC m=+1065.040898391" lastFinishedPulling="2026-02-27 06:28:30.556581262 +0000 UTC m=+1089.019201861" observedRunningTime="2026-02-27 06:28:32.629796813 +0000 UTC m=+1091.092417382" watchObservedRunningTime="2026-02-27 06:28:32.633025884 +0000 UTC m=+1091.095646453" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.658709 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" podStartSLOduration=3.402946253 podStartE2EDuration="27.658693657s" podCreationTimestamp="2026-02-27 06:28:05 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.617707563 +0000 UTC m=+1065.080328132" lastFinishedPulling="2026-02-27 06:28:30.873454917 +0000 UTC m=+1089.336075536" observedRunningTime="2026-02-27 06:28:32.656500555 +0000 UTC m=+1091.119121134" 
watchObservedRunningTime="2026-02-27 06:28:32.658693657 +0000 UTC m=+1091.121314226" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.675955 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" podStartSLOduration=8.512376684 podStartE2EDuration="28.675940623s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.340936458 +0000 UTC m=+1064.803557027" lastFinishedPulling="2026-02-27 06:28:26.504500397 +0000 UTC m=+1084.967120966" observedRunningTime="2026-02-27 06:28:32.672027463 +0000 UTC m=+1091.134648032" watchObservedRunningTime="2026-02-27 06:28:32.675940623 +0000 UTC m=+1091.138561192" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.706686 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" podStartSLOduration=8.638400762 podStartE2EDuration="28.706667328s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.436505249 +0000 UTC m=+1064.899125818" lastFinishedPulling="2026-02-27 06:28:26.504771815 +0000 UTC m=+1084.967392384" observedRunningTime="2026-02-27 06:28:32.704149197 +0000 UTC m=+1091.166769766" watchObservedRunningTime="2026-02-27 06:28:32.706667328 +0000 UTC m=+1091.169287897" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.733052 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" podStartSLOduration=8.008356298 podStartE2EDuration="28.733034861s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:05.780197715 +0000 UTC m=+1064.242818284" lastFinishedPulling="2026-02-27 06:28:26.504876278 +0000 UTC m=+1084.967496847" observedRunningTime="2026-02-27 06:28:32.731787636 +0000 UTC m=+1091.194408205" 
watchObservedRunningTime="2026-02-27 06:28:32.733034861 +0000 UTC m=+1091.195655430" Feb 27 06:28:32 crc kubenswrapper[4725]: I0227 06:28:32.752794 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" podStartSLOduration=8.812077115 podStartE2EDuration="28.752776667s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.564229377 +0000 UTC m=+1065.026849946" lastFinishedPulling="2026-02-27 06:28:26.504928919 +0000 UTC m=+1084.967549498" observedRunningTime="2026-02-27 06:28:32.751300605 +0000 UTC m=+1091.213921184" watchObservedRunningTime="2026-02-27 06:28:32.752776667 +0000 UTC m=+1091.215397236" Feb 27 06:28:35 crc kubenswrapper[4725]: I0227 06:28:35.021596 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" event={"ID":"55b7330d-fa67-491c-9354-3ae2f377b245","Type":"ContainerStarted","Data":"4c5ae773cdc7419c1960b56193e18c8011991f0c20440d5e6eb4f9170f7a6d33"} Feb 27 06:28:35 crc kubenswrapper[4725]: I0227 06:28:35.023087 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:35 crc kubenswrapper[4725]: I0227 06:28:35.045831 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" podStartSLOduration=2.442612532 podStartE2EDuration="31.04580542s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.119954634 +0000 UTC m=+1064.582575203" lastFinishedPulling="2026-02-27 06:28:34.723147522 +0000 UTC m=+1093.185768091" observedRunningTime="2026-02-27 06:28:35.036059606 +0000 UTC m=+1093.498680205" watchObservedRunningTime="2026-02-27 06:28:35.04580542 +0000 UTC m=+1093.508426019" Feb 27 06:28:36 crc kubenswrapper[4725]: 
I0227 06:28:36.035864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" event={"ID":"1ec01345-1480-48b1-9d36-9dd8a9fc2ef8","Type":"ContainerStarted","Data":"332d467be2b527a6813865627466a3954392e0218df2a5e2e25bb6a2fac90c17"} Feb 27 06:28:36 crc kubenswrapper[4725]: I0227 06:28:36.036621 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:36 crc kubenswrapper[4725]: I0227 06:28:36.061142 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" podStartSLOduration=2.5338156400000003 podStartE2EDuration="32.061122915s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.158696075 +0000 UTC m=+1064.621316644" lastFinishedPulling="2026-02-27 06:28:35.68600335 +0000 UTC m=+1094.148623919" observedRunningTime="2026-02-27 06:28:36.055427655 +0000 UTC m=+1094.518048284" watchObservedRunningTime="2026-02-27 06:28:36.061122915 +0000 UTC m=+1094.523743484" Feb 27 06:28:36 crc kubenswrapper[4725]: I0227 06:28:36.858525 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:36 crc kubenswrapper[4725]: I0227 06:28:36.867869 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/119c1266-bd43-49d6-a39f-93abbf47c2be-cert\") pod \"infra-operator-controller-manager-79d975b745-llwd8\" (UID: \"119c1266-bd43-49d6-a39f-93abbf47c2be\") " 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:36 crc kubenswrapper[4725]: I0227 06:28:36.955630 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vwl4f" Feb 27 06:28:36 crc kubenswrapper[4725]: I0227 06:28:36.963446 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:37 crc kubenswrapper[4725]: I0227 06:28:37.162639 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:37 crc kubenswrapper[4725]: I0227 06:28:37.169327 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fccc629-9a1d-4920-b3e7-817e49953fc1-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4\" (UID: \"5fccc629-9a1d-4920-b3e7-817e49953fc1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:37 crc kubenswrapper[4725]: I0227 06:28:37.293152 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-drgm4" Feb 27 06:28:37 crc kubenswrapper[4725]: I0227 06:28:37.297096 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:37 crc kubenswrapper[4725]: W0227 06:28:37.324458 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119c1266_bd43_49d6_a39f_93abbf47c2be.slice/crio-4526d657ecf01aa1628861dfe0c5f78b1b5c4f95c67b4dfff5ec528cc7d05e22 WatchSource:0}: Error finding container 4526d657ecf01aa1628861dfe0c5f78b1b5c4f95c67b4dfff5ec528cc7d05e22: Status 404 returned error can't find the container with id 4526d657ecf01aa1628861dfe0c5f78b1b5c4f95c67b4dfff5ec528cc7d05e22 Feb 27 06:28:37 crc kubenswrapper[4725]: I0227 06:28:37.330222 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-llwd8"] Feb 27 06:28:37 crc kubenswrapper[4725]: I0227 06:28:37.543234 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4"] Feb 27 06:28:37 crc kubenswrapper[4725]: W0227 06:28:37.555237 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fccc629_9a1d_4920_b3e7_817e49953fc1.slice/crio-596fd626acbaade33464b3d662d91ac6dce72c66e6f13581289956dabd22ae38 WatchSource:0}: Error finding container 596fd626acbaade33464b3d662d91ac6dce72c66e6f13581289956dabd22ae38: Status 404 returned error can't find the container with id 596fd626acbaade33464b3d662d91ac6dce72c66e6f13581289956dabd22ae38 Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.056251 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" event={"ID":"119c1266-bd43-49d6-a39f-93abbf47c2be","Type":"ContainerStarted","Data":"4526d657ecf01aa1628861dfe0c5f78b1b5c4f95c67b4dfff5ec528cc7d05e22"} Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 
06:28:38.058901 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" event={"ID":"1e6b09aa-e1b0-41c7-8aa0-e560de6310d5","Type":"ContainerStarted","Data":"9576faf9305d2ea31c3fb80237ddc6587d3f008d74f4cb4132ef3b8c356efb61"} Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.059190 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.060506 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" event={"ID":"5fccc629-9a1d-4920-b3e7-817e49953fc1","Type":"ContainerStarted","Data":"596fd626acbaade33464b3d662d91ac6dce72c66e6f13581289956dabd22ae38"} Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.062697 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" event={"ID":"76de952b-76db-47de-8891-40006493cf30","Type":"ContainerStarted","Data":"4fa401217ffd881be659fb08e047c6b49b81541ce41b2f27f3bdc5f308f54381"} Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.062888 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.099921 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" podStartSLOduration=2.527051429 podStartE2EDuration="34.099905757s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.177845334 +0000 UTC m=+1064.640465903" lastFinishedPulling="2026-02-27 06:28:37.750699632 +0000 UTC m=+1096.213320231" observedRunningTime="2026-02-27 06:28:38.079728839 +0000 UTC 
m=+1096.542349428" watchObservedRunningTime="2026-02-27 06:28:38.099905757 +0000 UTC m=+1096.562526326" Feb 27 06:28:38 crc kubenswrapper[4725]: I0227 06:28:38.100275 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" podStartSLOduration=2.836851905 podStartE2EDuration="34.100269388s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.48092228 +0000 UTC m=+1064.943542849" lastFinishedPulling="2026-02-27 06:28:37.744339763 +0000 UTC m=+1096.206960332" observedRunningTime="2026-02-27 06:28:38.097751807 +0000 UTC m=+1096.560372406" watchObservedRunningTime="2026-02-27 06:28:38.100269388 +0000 UTC m=+1096.562889957" Feb 27 06:28:39 crc kubenswrapper[4725]: I0227 06:28:39.834147 4725 scope.go:117] "RemoveContainer" containerID="9a6d2e987fbb19bf18c1c83af76ab8a75f1430f9a9e2ecfb2bd70b7b1a1ce096" Feb 27 06:28:41 crc kubenswrapper[4725]: I0227 06:28:41.448129 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5cb5b7b9c5-kwj9k" Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.104206 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" event={"ID":"5fccc629-9a1d-4920-b3e7-817e49953fc1","Type":"ContainerStarted","Data":"186756a907617c1fa61f110dc299ff9e162a8c7c6f5287cba38023f23ffc16ed"} Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.105247 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.107990 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" 
event={"ID":"119c1266-bd43-49d6-a39f-93abbf47c2be","Type":"ContainerStarted","Data":"a7340f1c3b3d5fdbcb9ea3d52b0e2a5f91129f8fafbc07f92bb0ebf8715b850d"} Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.108421 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.111245 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" event={"ID":"c32453ad-27be-4f95-bfc1-67878c36f13a","Type":"ContainerStarted","Data":"0eae9c37828e198068123bcc81349b375db8f91d156277b912ac64e7a805413e"} Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.111560 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.164848 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" podStartSLOduration=34.776123383 podStartE2EDuration="38.164820414s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:37.556792261 +0000 UTC m=+1096.019412830" lastFinishedPulling="2026-02-27 06:28:40.945489252 +0000 UTC m=+1099.408109861" observedRunningTime="2026-02-27 06:28:42.163267341 +0000 UTC m=+1100.625887940" watchObservedRunningTime="2026-02-27 06:28:42.164820414 +0000 UTC m=+1100.627441023" Feb 27 06:28:42 crc kubenswrapper[4725]: I0227 06:28:42.189561 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" podStartSLOduration=5.892659081 podStartE2EDuration="38.189542551s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:06.392632744 +0000 UTC 
m=+1064.855253313" lastFinishedPulling="2026-02-27 06:28:38.689516214 +0000 UTC m=+1097.152136783" observedRunningTime="2026-02-27 06:28:42.186464204 +0000 UTC m=+1100.649084783" watchObservedRunningTime="2026-02-27 06:28:42.189542551 +0000 UTC m=+1100.652163130" Feb 27 06:28:44 crc kubenswrapper[4725]: I0227 06:28:44.971897 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-6sd5f" Feb 27 06:28:44 crc kubenswrapper[4725]: I0227 06:28:44.990260 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-pl57b" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.041485 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" podStartSLOduration=37.432145555 podStartE2EDuration="41.04145683s" podCreationTimestamp="2026-02-27 06:28:04 +0000 UTC" firstStartedPulling="2026-02-27 06:28:37.336191177 +0000 UTC m=+1095.798811746" lastFinishedPulling="2026-02-27 06:28:40.945502412 +0000 UTC m=+1099.408123021" observedRunningTime="2026-02-27 06:28:42.219853364 +0000 UTC m=+1100.682473953" watchObservedRunningTime="2026-02-27 06:28:45.04145683 +0000 UTC m=+1103.504077409" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.061589 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-pklxn" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.154142 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jjgd6" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.206378 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-wc9zj" 
Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.299110 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-4wg4t" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.305946 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xkffm" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.316758 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-48l7p" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.359961 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-zfhwz" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.365954 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-x8wnp" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.430496 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-4g4xh" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.465891 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-6d6fj" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.486191 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-vx95x" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.501918 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fgn4m" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 
06:28:45.627834 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-m65cb" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.712019 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-2sqrk" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.784515 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-jp6zc" Feb 27 06:28:45 crc kubenswrapper[4725]: I0227 06:28:45.808802 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c68576fd-g8db5" Feb 27 06:28:46 crc kubenswrapper[4725]: I0227 06:28:46.973308 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-llwd8" Feb 27 06:28:47 crc kubenswrapper[4725]: I0227 06:28:47.307447 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4" Feb 27 06:28:55 crc kubenswrapper[4725]: I0227 06:28:55.464913 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-pfq48" Feb 27 06:29:02 crc kubenswrapper[4725]: I0227 06:29:02.555171 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:29:02 crc kubenswrapper[4725]: I0227 06:29:02.555996 4725 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.105212 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688fdbb689-swt8n"] Feb 27 06:29:15 crc kubenswrapper[4725]: E0227 06:29:15.106015 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0efcde2-60c3-4d14-bec9-056e06640cc6" containerName="oc" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.106028 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0efcde2-60c3-4d14-bec9-056e06640cc6" containerName="oc" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.106179 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0efcde2-60c3-4d14-bec9-056e06640cc6" containerName="oc" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.106897 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.110683 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.110982 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.110851 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zdc9q" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.110889 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.121552 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688fdbb689-swt8n"] Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.164613 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75f8b56f9c-jhj7s"] Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.166483 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.169373 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.178586 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f8b56f9c-jhj7s"] Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.227300 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-config\") pod \"dnsmasq-dns-688fdbb689-swt8n\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.227395 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmw9b\" (UniqueName: \"kubernetes.io/projected/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-kube-api-access-gmw9b\") pod \"dnsmasq-dns-688fdbb689-swt8n\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.328632 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-config\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.328697 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-config\") pod \"dnsmasq-dns-688fdbb689-swt8n\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc 
kubenswrapper[4725]: I0227 06:29:15.328734 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmw9b\" (UniqueName: \"kubernetes.io/projected/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-kube-api-access-gmw9b\") pod \"dnsmasq-dns-688fdbb689-swt8n\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.328758 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-dns-svc\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.328786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727bh\" (UniqueName: \"kubernetes.io/projected/cc99abc2-145a-4d33-b04c-f5293b264df0-kube-api-access-727bh\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.350249 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-config\") pod \"dnsmasq-dns-688fdbb689-swt8n\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.354482 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmw9b\" (UniqueName: \"kubernetes.io/projected/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-kube-api-access-gmw9b\") pod \"dnsmasq-dns-688fdbb689-swt8n\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: 
I0227 06:29:15.428764 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.429455 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-config\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.429571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-dns-svc\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.429608 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727bh\" (UniqueName: \"kubernetes.io/projected/cc99abc2-145a-4d33-b04c-f5293b264df0-kube-api-access-727bh\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.430934 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-dns-svc\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.436388 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-config\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " 
pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.450473 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727bh\" (UniqueName: \"kubernetes.io/projected/cc99abc2-145a-4d33-b04c-f5293b264df0-kube-api-access-727bh\") pod \"dnsmasq-dns-75f8b56f9c-jhj7s\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.493918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:15 crc kubenswrapper[4725]: I0227 06:29:15.963258 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688fdbb689-swt8n"] Feb 27 06:29:16 crc kubenswrapper[4725]: I0227 06:29:16.009898 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f8b56f9c-jhj7s"] Feb 27 06:29:16 crc kubenswrapper[4725]: W0227 06:29:16.014655 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc99abc2_145a_4d33_b04c_f5293b264df0.slice/crio-3453e470796e8e7371d2febf2cb32205d53b91aefcb3772f647ef80b3c692126 WatchSource:0}: Error finding container 3453e470796e8e7371d2febf2cb32205d53b91aefcb3772f647ef80b3c692126: Status 404 returned error can't find the container with id 3453e470796e8e7371d2febf2cb32205d53b91aefcb3772f647ef80b3c692126 Feb 27 06:29:16 crc kubenswrapper[4725]: I0227 06:29:16.420319 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688fdbb689-swt8n" event={"ID":"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4","Type":"ContainerStarted","Data":"bf945f6f474e8d0143082c31e1425392ff8b3e2b10c9e873b91fd7b20716b408"} Feb 27 06:29:16 crc kubenswrapper[4725]: I0227 06:29:16.421582 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" 
event={"ID":"cc99abc2-145a-4d33-b04c-f5293b264df0","Type":"ContainerStarted","Data":"3453e470796e8e7371d2febf2cb32205d53b91aefcb3772f647ef80b3c692126"} Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.642099 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688fdbb689-swt8n"] Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.679965 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d87c9cc-fv2k9"] Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.681162 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.689902 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d87c9cc-fv2k9"] Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.782812 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-config\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.783078 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-dns-svc\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.783154 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqjfj\" (UniqueName: \"kubernetes.io/projected/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-kube-api-access-bqjfj\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " 
pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.884920 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-dns-svc\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.885210 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqjfj\" (UniqueName: \"kubernetes.io/projected/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-kube-api-access-bqjfj\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.885239 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-config\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.885947 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-config\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.886019 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-dns-svc\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.915772 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqjfj\" (UniqueName: \"kubernetes.io/projected/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-kube-api-access-bqjfj\") pod \"dnsmasq-dns-568d87c9cc-fv2k9\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.932116 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f8b56f9c-jhj7s"] Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.951691 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57667b4457-gphcn"] Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.953629 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:18 crc kubenswrapper[4725]: I0227 06:29:18.962891 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57667b4457-gphcn"] Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.011652 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.089103 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdbzm\" (UniqueName: \"kubernetes.io/projected/ff2b076f-c2ba-43e0-b914-addd791c7be3-kube-api-access-rdbzm\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.089154 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-config\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.089196 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-dns-svc\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.190609 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdbzm\" (UniqueName: \"kubernetes.io/projected/ff2b076f-c2ba-43e0-b914-addd791c7be3-kube-api-access-rdbzm\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.190655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-config\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: 
\"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.190704 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-dns-svc\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.191734 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-config\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.191831 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-dns-svc\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.213016 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d87c9cc-fv2k9"] Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.214026 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdbzm\" (UniqueName: \"kubernetes.io/projected/ff2b076f-c2ba-43e0-b914-addd791c7be3-kube-api-access-rdbzm\") pod \"dnsmasq-dns-57667b4457-gphcn\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.250660 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9c6656465-qtdhm"] Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.252337 4725 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.263762 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c6656465-qtdhm"] Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.272623 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.393555 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-config\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.393610 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-dns-svc\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.393678 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xt9\" (UniqueName: \"kubernetes.io/projected/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-kube-api-access-k4xt9\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.494815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-config\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " 
pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.495776 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-config\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.496835 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-dns-svc\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.497016 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xt9\" (UniqueName: \"kubernetes.io/projected/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-kube-api-access-k4xt9\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.497546 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-dns-svc\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.510870 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xt9\" (UniqueName: \"kubernetes.io/projected/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-kube-api-access-k4xt9\") pod \"dnsmasq-dns-9c6656465-qtdhm\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") " pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 
06:29:19.570783 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.828685 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.830547 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.832925 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.833272 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.833560 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.833769 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.833779 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.833574 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.836420 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.841477 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-dzrc4" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.903876 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.903987 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904048 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrggb\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-kube-api-access-jrggb\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904098 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc2bb345-ef60-4c05-8461-1821e1db5216-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904123 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc 
kubenswrapper[4725]: I0227 06:29:19.904192 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904248 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904344 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc2bb345-ef60-4c05-8461-1821e1db5216-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904402 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: 
\"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:19 crc kubenswrapper[4725]: I0227 06:29:19.904431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006116 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc2bb345-ef60-4c05-8461-1821e1db5216-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006170 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006248 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006281 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: 
\"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006324 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006354 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc2bb345-ef60-4c05-8461-1821e1db5216-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006385 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006409 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"notifications-rabbitmq-server-0\" (UID: 
\"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006509 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.006546 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrggb\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-kube-api-access-jrggb\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.007480 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.007634 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.008227 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: 
\"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.008503 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.008556 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.008812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc2bb345-ef60-4c05-8461-1821e1db5216-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.011656 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc2bb345-ef60-4c05-8461-1821e1db5216-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.012806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.014941 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.015646 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc2bb345-ef60-4c05-8461-1821e1db5216-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.023182 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrggb\" (UniqueName: \"kubernetes.io/projected/bc2bb345-ef60-4c05-8461-1821e1db5216-kube-api-access-jrggb\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.029871 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"bc2bb345-ef60-4c05-8461-1821e1db5216\") " pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.086144 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.087259 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.091023 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jkwst" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.091486 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.091748 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.091923 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.092080 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.092271 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.092453 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.100186 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.157807 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209567 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209637 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209695 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb0a4e8-6f65-4961-9caa-18d66a6754af-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209734 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb0a4e8-6f65-4961-9caa-18d66a6754af-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209777 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " 
pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209836 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209921 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.209979 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g2tv\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-kube-api-access-4g2tv\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.210028 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-config-data\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.210186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 
06:29:20.210268 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.311982 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312043 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312069 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312092 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb0a4e8-6f65-4961-9caa-18d66a6754af-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312112 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/fdb0a4e8-6f65-4961-9caa-18d66a6754af-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312136 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312156 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312212 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g2tv\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-kube-api-access-4g2tv\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312229 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-config-data\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " 
pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312272 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312496 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312804 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.313076 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.312940 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.313641 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-config-data\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.314253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.314774 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb0a4e8-6f65-4961-9caa-18d66a6754af-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.319061 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb0a4e8-6f65-4961-9caa-18d66a6754af-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.320472 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.320811 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " 
pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.329869 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g2tv\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-kube-api-access-4g2tv\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.339795 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.394655 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.397180 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.400232 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.400512 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.400799 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.400949 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.401086 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.401228 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g8rzr" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.401440 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.405579 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.422039 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514553 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514606 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ac67077-5fb4-4890-98ba-f5280a08e464-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514625 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514665 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ac67077-5fb4-4890-98ba-f5280a08e464-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514687 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514705 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514726 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514741 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdd8v\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-kube-api-access-rdd8v\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514779 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.514804 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616451 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616500 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616526 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616539 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdd8v\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-kube-api-access-rdd8v\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: 
I0227 06:29:20.616584 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616610 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616636 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616662 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ac67077-5fb4-4890-98ba-f5280a08e464-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616677 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616693 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.616717 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ac67077-5fb4-4890-98ba-f5280a08e464-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.617476 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.618043 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.618171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.619629 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.619723 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.620269 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.621234 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ac67077-5fb4-4890-98ba-f5280a08e464-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.621746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ac67077-5fb4-4890-98ba-f5280a08e464-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.627951 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc 
kubenswrapper[4725]: I0227 06:29:20.630143 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.637984 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdd8v\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-kube-api-access-rdd8v\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.639340 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:20 crc kubenswrapper[4725]: I0227 06:29:20.735710 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.909104 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.910250 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.914355 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cr66n" Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.914678 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.914804 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.977623 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.977764 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 27 06:29:21 crc kubenswrapper[4725]: I0227 06:29:21.981606 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072333 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-kolla-config\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072619 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072670 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072696 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n58qh\" (UniqueName: \"kubernetes.io/projected/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-kube-api-access-n58qh\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072732 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-config-data-default\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.072769 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173771 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-kolla-config\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173835 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173855 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173885 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173912 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n58qh\" (UniqueName: \"kubernetes.io/projected/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-kube-api-access-n58qh\") pod \"openstack-galera-0\" (UID: 
\"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173964 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.173985 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-config-data-default\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.174004 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.174411 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.174459 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.175831 4725 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.176954 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.184934 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-kolla-config\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.186158 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-config-data-default\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.186490 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.195834 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.196410 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.197240 4725 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.203406 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.209968 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.226142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n58qh\" (UniqueName: \"kubernetes.io/projected/7f24b5c8-baad-48b6-9242-2ad6bb6c471f-kube-api-access-n58qh\") pod \"openstack-galera-0\" (UID: \"7f24b5c8-baad-48b6-9242-2ad6bb6c471f\") " pod="openstack/openstack-galera-0" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.315880 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cr66n" Feb 27 06:29:22 crc kubenswrapper[4725]: I0227 06:29:22.325187 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.457520 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.459783 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.465219 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.465441 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.465548 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jj2fw" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.465646 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.474677 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602161 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602219 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76702ae7-c9e6-485b-abc9-b54e4c073ee1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602240 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76702ae7-c9e6-485b-abc9-b54e4c073ee1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602262 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrdl\" (UniqueName: \"kubernetes.io/projected/76702ae7-c9e6-485b-abc9-b54e4c073ee1-kube-api-access-rlrdl\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602281 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76702ae7-c9e6-485b-abc9-b54e4c073ee1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602371 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.602409 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.682281 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.684918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.689501 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.689507 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mntxg" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.689700 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.695099 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703235 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76702ae7-c9e6-485b-abc9-b54e4c073ee1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703287 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76702ae7-c9e6-485b-abc9-b54e4c073ee1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc 
kubenswrapper[4725]: I0227 06:29:23.703323 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrdl\" (UniqueName: \"kubernetes.io/projected/76702ae7-c9e6-485b-abc9-b54e4c073ee1-kube-api-access-rlrdl\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703352 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76702ae7-c9e6-485b-abc9-b54e4c073ee1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703404 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703427 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703466 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.703488 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.704097 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.704340 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76702ae7-c9e6-485b-abc9-b54e4c073ee1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.704731 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.705415 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.708676 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/76702ae7-c9e6-485b-abc9-b54e4c073ee1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.710265 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76702ae7-c9e6-485b-abc9-b54e4c073ee1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.721881 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76702ae7-c9e6-485b-abc9-b54e4c073ee1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.729195 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrdl\" (UniqueName: \"kubernetes.io/projected/76702ae7-c9e6-485b-abc9-b54e4c073ee1-kube-api-access-rlrdl\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.738113 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76702ae7-c9e6-485b-abc9-b54e4c073ee1\") " pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.783747 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.804912 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/163ce132-3935-4648-b50f-fab5db3c17ca-config-data\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.805044 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmxvv\" (UniqueName: \"kubernetes.io/projected/163ce132-3935-4648-b50f-fab5db3c17ca-kube-api-access-mmxvv\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.805075 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ce132-3935-4648-b50f-fab5db3c17ca-combined-ca-bundle\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.805159 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/163ce132-3935-4648-b50f-fab5db3c17ca-kolla-config\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.805186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/163ce132-3935-4648-b50f-fab5db3c17ca-memcached-tls-certs\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 
06:29:23.907017 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmxvv\" (UniqueName: \"kubernetes.io/projected/163ce132-3935-4648-b50f-fab5db3c17ca-kube-api-access-mmxvv\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.907062 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ce132-3935-4648-b50f-fab5db3c17ca-combined-ca-bundle\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.907095 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/163ce132-3935-4648-b50f-fab5db3c17ca-kolla-config\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.907117 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/163ce132-3935-4648-b50f-fab5db3c17ca-memcached-tls-certs\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.907152 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/163ce132-3935-4648-b50f-fab5db3c17ca-config-data\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.908027 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/163ce132-3935-4648-b50f-fab5db3c17ca-config-data\") pod 
\"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.908068 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/163ce132-3935-4648-b50f-fab5db3c17ca-kolla-config\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.915238 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163ce132-3935-4648-b50f-fab5db3c17ca-combined-ca-bundle\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.916270 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/163ce132-3935-4648-b50f-fab5db3c17ca-memcached-tls-certs\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.924321 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmxvv\" (UniqueName: \"kubernetes.io/projected/163ce132-3935-4648-b50f-fab5db3c17ca-kube-api-access-mmxvv\") pod \"memcached-0\" (UID: \"163ce132-3935-4648-b50f-fab5db3c17ca\") " pod="openstack/memcached-0" Feb 27 06:29:23 crc kubenswrapper[4725]: I0227 06:29:23.999628 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.695330 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.696570 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.701784 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mnsh9" Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.704960 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.843490 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-474tq\" (UniqueName: \"kubernetes.io/projected/b57cf7f3-cfa9-403a-8c71-84b46d6dd189-kube-api-access-474tq\") pod \"kube-state-metrics-0\" (UID: \"b57cf7f3-cfa9-403a-8c71-84b46d6dd189\") " pod="openstack/kube-state-metrics-0" Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.945136 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-474tq\" (UniqueName: \"kubernetes.io/projected/b57cf7f3-cfa9-403a-8c71-84b46d6dd189-kube-api-access-474tq\") pod \"kube-state-metrics-0\" (UID: \"b57cf7f3-cfa9-403a-8c71-84b46d6dd189\") " pod="openstack/kube-state-metrics-0" Feb 27 06:29:25 crc kubenswrapper[4725]: I0227 06:29:25.980544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-474tq\" (UniqueName: \"kubernetes.io/projected/b57cf7f3-cfa9-403a-8c71-84b46d6dd189-kube-api-access-474tq\") pod \"kube-state-metrics-0\" (UID: \"b57cf7f3-cfa9-403a-8c71-84b46d6dd189\") " pod="openstack/kube-state-metrics-0" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.011421 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.975274 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.977923 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.984116 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.986365 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.987461 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.987684 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.987864 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.988302 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l87pd" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.989240 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.990523 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 27 06:29:26 crc kubenswrapper[4725]: I0227 06:29:26.996944 4725 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112256 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-config\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112329 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112354 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112372 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112399 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzcl\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-kube-api-access-gpzcl\") pod 
\"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112512 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7421655b-5f80-4ec8-94f7-73a189f7460f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112574 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112605 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112824 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.112890 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.214872 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-config\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.214942 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.214977 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215034 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzcl\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-kube-api-access-gpzcl\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215064 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7421655b-5f80-4ec8-94f7-73a189f7460f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215084 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215100 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215186 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " 
pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215223 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.215891 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.216193 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.216288 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.218484 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7421655b-5f80-4ec8-94f7-73a189f7460f-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.219570 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.220393 4725 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.220510 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e30db3ebf2fd7ac6c73c4f03a68dbdb833990d29c0091afb2dd5fd8d7a51236/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.221345 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.231348 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.231701 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzcl\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-kube-api-access-gpzcl\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.236585 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-config\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.254085 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:27 crc kubenswrapper[4725]: I0227 06:29:27.307756 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.163146 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6kvbc"] Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.164757 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.168124 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-l4fjv" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.168305 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.168401 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.185144 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kvbc"] Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.228744 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zvrdk"] Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.230216 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.239939 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zvrdk"] Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.354377 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8m5\" (UniqueName: \"kubernetes.io/projected/03406108-89c6-4681-aeba-c6874d465b62-kube-api-access-gv8m5\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.354997 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03406108-89c6-4681-aeba-c6874d465b62-combined-ca-bundle\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355050 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-log-ovn\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355116 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-run\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355139 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-lib\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355169 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-log\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355198 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/03406108-89c6-4681-aeba-c6874d465b62-ovn-controller-tls-certs\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355236 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d05458c2-f003-46ea-a38c-eda2c69b4635-scripts\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355256 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03406108-89c6-4681-aeba-c6874d465b62-scripts\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355291 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mgc\" (UniqueName: 
\"kubernetes.io/projected/d05458c2-f003-46ea-a38c-eda2c69b4635-kube-api-access-r7mgc\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355363 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-etc-ovs\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355403 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-run-ovn\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.355450 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-run\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457703 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8m5\" (UniqueName: \"kubernetes.io/projected/03406108-89c6-4681-aeba-c6874d465b62-kube-api-access-gv8m5\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457753 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03406108-89c6-4681-aeba-c6874d465b62-combined-ca-bundle\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457787 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-log-ovn\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457818 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-run\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457838 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-lib\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457858 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-log\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457880 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/03406108-89c6-4681-aeba-c6874d465b62-ovn-controller-tls-certs\") pod \"ovn-controller-6kvbc\" (UID: 
\"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457908 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d05458c2-f003-46ea-a38c-eda2c69b4635-scripts\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457921 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03406108-89c6-4681-aeba-c6874d465b62-scripts\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457943 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mgc\" (UniqueName: \"kubernetes.io/projected/d05458c2-f003-46ea-a38c-eda2c69b4635-kube-api-access-r7mgc\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457959 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-etc-ovs\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.457984 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-run-ovn\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458013 
4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-run\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458540 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-log-ovn\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458578 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-run\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458654 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/03406108-89c6-4681-aeba-c6874d465b62-var-run-ovn\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458667 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-log\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458694 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-lib\") pod \"ovn-controller-ovs-zvrdk\" 
(UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458702 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-var-run\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.458841 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d05458c2-f003-46ea-a38c-eda2c69b4635-etc-ovs\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.461276 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d05458c2-f003-46ea-a38c-eda2c69b4635-scripts\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.461843 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03406108-89c6-4681-aeba-c6874d465b62-scripts\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.462988 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/03406108-89c6-4681-aeba-c6874d465b62-ovn-controller-tls-certs\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.463056 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03406108-89c6-4681-aeba-c6874d465b62-combined-ca-bundle\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.484075 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8m5\" (UniqueName: \"kubernetes.io/projected/03406108-89c6-4681-aeba-c6874d465b62-kube-api-access-gv8m5\") pod \"ovn-controller-6kvbc\" (UID: \"03406108-89c6-4681-aeba-c6874d465b62\") " pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.488252 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mgc\" (UniqueName: \"kubernetes.io/projected/d05458c2-f003-46ea-a38c-eda2c69b4635-kube-api-access-r7mgc\") pod \"ovn-controller-ovs-zvrdk\" (UID: \"d05458c2-f003-46ea-a38c-eda2c69b4635\") " pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.501386 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kvbc" Feb 27 06:29:29 crc kubenswrapper[4725]: I0227 06:29:29.550993 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:32 crc kubenswrapper[4725]: I0227 06:29:32.553841 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:29:32 crc kubenswrapper[4725]: I0227 06:29:32.554331 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.337630 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.339225 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.340896 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.341356 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.341677 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.341947 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qxn58" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.343247 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.351908 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.526601 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a3fa421-de83-44cb-8857-ef6f679f37dc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.526711 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3fa421-de83-44cb-8857-ef6f679f37dc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.526798 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.526855 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.526944 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3fa421-de83-44cb-8857-ef6f679f37dc-config\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.527035 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.527055 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxq4c\" (UniqueName: \"kubernetes.io/projected/8a3fa421-de83-44cb-8857-ef6f679f37dc-kube-api-access-sxq4c\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.527134 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.531744 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.533313 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.536154 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.536315 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.536434 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-w9qth" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.536544 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.542935 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628647 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3fa421-de83-44cb-8857-ef6f679f37dc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628702 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-ovsdb-rundir\") pod 
\"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628722 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628745 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628799 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a3fa421-de83-44cb-8857-ef6f679f37dc-config\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc 
kubenswrapper[4725]: I0227 06:29:33.628843 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628865 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628914 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628935 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628952 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxq4c\" (UniqueName: \"kubernetes.io/projected/8a3fa421-de83-44cb-8857-ef6f679f37dc-kube-api-access-sxq4c\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628976 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrpq\" (UniqueName: \"kubernetes.io/projected/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-kube-api-access-2vrpq\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.628996 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-config\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.629020 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.629049 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a3fa421-de83-44cb-8857-ef6f679f37dc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.629369 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.629927 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a3fa421-de83-44cb-8857-ef6f679f37dc-config\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.630463 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a3fa421-de83-44cb-8857-ef6f679f37dc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.630956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3fa421-de83-44cb-8857-ef6f679f37dc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.635291 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.637132 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.641367 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3fa421-de83-44cb-8857-ef6f679f37dc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 
06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.649735 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxq4c\" (UniqueName: \"kubernetes.io/projected/8a3fa421-de83-44cb-8857-ef6f679f37dc-kube-api-access-sxq4c\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.658415 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8a3fa421-de83-44cb-8857-ef6f679f37dc\") " pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.714107 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.730197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-config\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.730402 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.730434 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: 
I0227 06:29:33.730819 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.730876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.731104 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.731440 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-config\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.731469 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.733264 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") 
pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.733328 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.733371 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrpq\" (UniqueName: \"kubernetes.io/projected/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-kube-api-access-2vrpq\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.734537 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.735882 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.739990 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc 
kubenswrapper[4725]: I0227 06:29:33.748107 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.757478 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.759145 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrpq\" (UniqueName: \"kubernetes.io/projected/67b7afed-e3d9-42c8-9604-9d9e56f1bc1d-kube-api-access-2vrpq\") pod \"ovsdbserver-nb-0\" (UID: \"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d\") " pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:33 crc kubenswrapper[4725]: I0227 06:29:33.852822 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 06:29:38 crc kubenswrapper[4725]: I0227 06:29:38.555225 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c6656465-qtdhm"] Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.931243 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.931611 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.931716 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.203:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gmw9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-688fdbb689-swt8n_openstack(3c60e732-28c5-4ccf-8f0f-c656fcba8ca4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.932905 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-688fdbb689-swt8n" podUID="3c60e732-28c5-4ccf-8f0f-c656fcba8ca4" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.970288 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.970346 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.970445 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.203:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-727bh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75f8b56f9c-jhj7s_openstack(cc99abc2-145a-4d33-b04c-f5293b264df0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:38 crc kubenswrapper[4725]: E0227 06:29:38.972651 4725 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" podUID="cc99abc2-145a-4d33-b04c-f5293b264df0" Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.620019 4725 generic.go:334] "Generic (PLEG): container finished" podID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerID="32a36bcb5697cf7fbd40ee05ddd63690e34560a52c053f76c1e6673439cccfe1" exitCode=0 Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.620589 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" event={"ID":"53bf2b68-ae88-4d0a-af2e-2b4a67e35257","Type":"ContainerDied","Data":"32a36bcb5697cf7fbd40ee05ddd63690e34560a52c053f76c1e6673439cccfe1"} Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.620639 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" event={"ID":"53bf2b68-ae88-4d0a-af2e-2b4a67e35257","Type":"ContainerStarted","Data":"b6ab525b7bddefa16e7d5ec40b8ce52e4d2a9062d9ab44d410f35207bc11091d"} Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.626976 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57667b4457-gphcn"] Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.635312 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:29:39 crc kubenswrapper[4725]: W0227 06:29:39.721583 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f24b5c8_baad_48b6_9242_2ad6bb6c471f.slice/crio-b0e987b20d0a91f44c0993a9a52acea26612d0631011e052fe737c41dbdca5b2 WatchSource:0}: Error finding container b0e987b20d0a91f44c0993a9a52acea26612d0631011e052fe737c41dbdca5b2: Status 404 returned error can't find the container with id b0e987b20d0a91f44c0993a9a52acea26612d0631011e052fe737c41dbdca5b2 Feb 
27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.724312 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.733699 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 06:29:39 crc kubenswrapper[4725]: I0227 06:29:39.833056 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 06:29:39 crc kubenswrapper[4725]: W0227 06:29:39.840851 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67b7afed_e3d9_42c8_9604_9d9e56f1bc1d.slice/crio-a189227437f628e01837a660c7845552e7f21b94d9b56f20b05134174067623f WatchSource:0}: Error finding container a189227437f628e01837a660c7845552e7f21b94d9b56f20b05134174067623f: Status 404 returned error can't find the container with id a189227437f628e01837a660c7845552e7f21b94d9b56f20b05134174067623f Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.040382 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.046116 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.057477 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-config\") pod \"cc99abc2-145a-4d33-b04c-f5293b264df0\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.057565 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmw9b\" (UniqueName: \"kubernetes.io/projected/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-kube-api-access-gmw9b\") pod \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.058130 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-config" (OuterVolumeSpecName: "config") pod "cc99abc2-145a-4d33-b04c-f5293b264df0" (UID: "cc99abc2-145a-4d33-b04c-f5293b264df0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.075560 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-kube-api-access-gmw9b" (OuterVolumeSpecName: "kube-api-access-gmw9b") pod "3c60e732-28c5-4ccf-8f0f-c656fcba8ca4" (UID: "3c60e732-28c5-4ccf-8f0f-c656fcba8ca4"). InnerVolumeSpecName "kube-api-access-gmw9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.159578 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-727bh\" (UniqueName: \"kubernetes.io/projected/cc99abc2-145a-4d33-b04c-f5293b264df0-kube-api-access-727bh\") pod \"cc99abc2-145a-4d33-b04c-f5293b264df0\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.159618 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-dns-svc\") pod \"cc99abc2-145a-4d33-b04c-f5293b264df0\" (UID: \"cc99abc2-145a-4d33-b04c-f5293b264df0\") " Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.159653 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-config\") pod \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\" (UID: \"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4\") " Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.159896 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmw9b\" (UniqueName: \"kubernetes.io/projected/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-kube-api-access-gmw9b\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.159907 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.160241 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-config" (OuterVolumeSpecName: "config") pod "3c60e732-28c5-4ccf-8f0f-c656fcba8ca4" (UID: "3c60e732-28c5-4ccf-8f0f-c656fcba8ca4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.160942 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc99abc2-145a-4d33-b04c-f5293b264df0" (UID: "cc99abc2-145a-4d33-b04c-f5293b264df0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.164618 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc99abc2-145a-4d33-b04c-f5293b264df0-kube-api-access-727bh" (OuterVolumeSpecName: "kube-api-access-727bh") pod "cc99abc2-145a-4d33-b04c-f5293b264df0" (UID: "cc99abc2-145a-4d33-b04c-f5293b264df0"). InnerVolumeSpecName "kube-api-access-727bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.246660 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.261593 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-727bh\" (UniqueName: \"kubernetes.io/projected/cc99abc2-145a-4d33-b04c-f5293b264df0-kube-api-access-727bh\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.261932 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc99abc2-145a-4d33-b04c-f5293b264df0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.261945 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.262754 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.272452 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kvbc"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.276103 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d87c9cc-fv2k9"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.297880 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.306961 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.318129 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:29:40 crc kubenswrapper[4725]: W0227 06:29:40.366576 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7421655b_5f80_4ec8_94f7_73a189f7460f.slice/crio-d7ac80521eff8a269104151bd837d9fc8c8c7ffa4a5e05740e496c86e814133d WatchSource:0}: Error finding container d7ac80521eff8a269104151bd837d9fc8c8c7ffa4a5e05740e496c86e814133d: Status 404 returned error can't find the container with id d7ac80521eff8a269104151bd837d9fc8c8c7ffa4a5e05740e496c86e814133d Feb 27 06:29:40 crc kubenswrapper[4725]: W0227 06:29:40.369261 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03406108_89c6_4681_aeba_c6874d465b62.slice/crio-972db3c4442171c40300a5b909c2f1a118a8ddf01c90bd6edeeba38f0cb7fca1 WatchSource:0}: Error finding container 972db3c4442171c40300a5b909c2f1a118a8ddf01c90bd6edeeba38f0cb7fca1: Status 404 returned error can't find the container with id 972db3c4442171c40300a5b909c2f1a118a8ddf01c90bd6edeeba38f0cb7fca1 Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.387744 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 06:29:40 crc kubenswrapper[4725]: W0227 06:29:40.397344 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76702ae7_c9e6_485b_abc9_b54e4c073ee1.slice/crio-dae1ea13b01bfdc5f5fbb358cd96a495b343a77d88df08f91691f7f272ce33f1 WatchSource:0}: Error finding container dae1ea13b01bfdc5f5fbb358cd96a495b343a77d88df08f91691f7f272ce33f1: Status 404 returned error can't find the container with id dae1ea13b01bfdc5f5fbb358cd96a495b343a77d88df08f91691f7f272ce33f1 Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.484989 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zvrdk"] Feb 27 06:29:40 crc kubenswrapper[4725]: W0227 06:29:40.495241 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd05458c2_f003_46ea_a38c_eda2c69b4635.slice/crio-b9cbde6d77c1eb8926a118607524be502c2b022d4070a60ffdb47c076fc9794c WatchSource:0}: Error finding container b9cbde6d77c1eb8926a118607524be502c2b022d4070a60ffdb47c076fc9794c: Status 404 returned error can't find the container with id b9cbde6d77c1eb8926a118607524be502c2b022d4070a60ffdb47c076fc9794c Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.635402 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvrdk" event={"ID":"d05458c2-f003-46ea-a38c-eda2c69b4635","Type":"ContainerStarted","Data":"b9cbde6d77c1eb8926a118607524be502c2b022d4070a60ffdb47c076fc9794c"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.637758 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d","Type":"ContainerStarted","Data":"a189227437f628e01837a660c7845552e7f21b94d9b56f20b05134174067623f"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 
06:29:40.639683 4725 generic.go:334] "Generic (PLEG): container finished" podID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerID="82532be0f76d13c4205f2f37b873be6420ba4f14dea3c720fd976527f2f00fb3" exitCode=0 Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.639725 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57667b4457-gphcn" event={"ID":"ff2b076f-c2ba-43e0-b914-addd791c7be3","Type":"ContainerDied","Data":"82532be0f76d13c4205f2f37b873be6420ba4f14dea3c720fd976527f2f00fb3"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.639777 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57667b4457-gphcn" event={"ID":"ff2b076f-c2ba-43e0-b914-addd791c7be3","Type":"ContainerStarted","Data":"4b784e42c52d1e4bca788f4ae89aea54ff070b3c3cb1be3c0e5cb5136c043360"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.645423 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" event={"ID":"cc99abc2-145a-4d33-b04c-f5293b264df0","Type":"ContainerDied","Data":"3453e470796e8e7371d2febf2cb32205d53b91aefcb3772f647ef80b3c692126"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.645525 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f8b56f9c-jhj7s" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.650725 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688fdbb689-swt8n" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.650725 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688fdbb689-swt8n" event={"ID":"3c60e732-28c5-4ccf-8f0f-c656fcba8ca4","Type":"ContainerDied","Data":"bf945f6f474e8d0143082c31e1425392ff8b3e2b10c9e873b91fd7b20716b408"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.658896 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"163ce132-3935-4648-b50f-fab5db3c17ca","Type":"ContainerStarted","Data":"883fcb212c0a3ee938ea49d92547bbb599656916e4ea9f23c2bae19c46049a9a"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.661405 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kvbc" event={"ID":"03406108-89c6-4681-aeba-c6874d465b62","Type":"ContainerStarted","Data":"972db3c4442171c40300a5b909c2f1a118a8ddf01c90bd6edeeba38f0cb7fca1"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.662911 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb0a4e8-6f65-4961-9caa-18d66a6754af","Type":"ContainerStarted","Data":"bc4b52f52ec537f563cacac677b069f6b43b675f9dc8e0b241c9d8b0349946e5"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.664531 4725 generic.go:334] "Generic (PLEG): container finished" podID="fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" containerID="85b0480f0667c48356b5d0827258ae90efcc16eae422addaf6445a7e519ea3b5" exitCode=0 Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.664580 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" event={"ID":"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad","Type":"ContainerDied","Data":"85b0480f0667c48356b5d0827258ae90efcc16eae422addaf6445a7e519ea3b5"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.664595 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" event={"ID":"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad","Type":"ContainerStarted","Data":"5fe7d36125513e94ae1b4bca527634b84de13f0a27c3d9bf7567fd720e2a2858"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.684343 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" event={"ID":"53bf2b68-ae88-4d0a-af2e-2b4a67e35257","Type":"ContainerStarted","Data":"69b1085067c728d7192bc6eaf76f3f4c6c15a9b088a01e3f18c1ff2b5a982947"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.686670 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.690791 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"bc2bb345-ef60-4c05-8461-1821e1db5216","Type":"ContainerStarted","Data":"349ab6274c8d9bab046b6a89aa22b6d25ae6a0d9e208ab4b284f17ca8f351cb3"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.695739 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ac67077-5fb4-4890-98ba-f5280a08e464","Type":"ContainerStarted","Data":"59ae8d274721137a61e6ac4d02c469f52953a315578ae1694d6453e61e687ce7"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.702511 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8a3fa421-de83-44cb-8857-ef6f679f37dc","Type":"ContainerStarted","Data":"6ee165c1fff324db1e5d82827cbb5d43dce0408513199ba552ff01182b4a8154"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.704357 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7f24b5c8-baad-48b6-9242-2ad6bb6c471f","Type":"ContainerStarted","Data":"b0e987b20d0a91f44c0993a9a52acea26612d0631011e052fe737c41dbdca5b2"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.706867 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b57cf7f3-cfa9-403a-8c71-84b46d6dd189","Type":"ContainerStarted","Data":"c6e15fc294d81245870a8b9bdbf6c29e0b1b46827ecf6db37201088db6f3363a"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.707860 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerStarted","Data":"d7ac80521eff8a269104151bd837d9fc8c8c7ffa4a5e05740e496c86e814133d"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.708738 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76702ae7-c9e6-485b-abc9-b54e4c073ee1","Type":"ContainerStarted","Data":"dae1ea13b01bfdc5f5fbb358cd96a495b343a77d88df08f91691f7f272ce33f1"} Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.734177 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f8b56f9c-jhj7s"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.741252 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75f8b56f9c-jhj7s"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.766770 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688fdbb689-swt8n"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.777344 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688fdbb689-swt8n"] Feb 27 06:29:40 crc kubenswrapper[4725]: I0227 06:29:40.786127 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" podStartSLOduration=21.646351828 podStartE2EDuration="21.786109567s" podCreationTimestamp="2026-02-27 06:29:19 +0000 UTC" firstStartedPulling="2026-02-27 06:29:38.918877653 +0000 UTC m=+1157.381498222" lastFinishedPulling="2026-02-27 06:29:39.058635392 +0000 UTC m=+1157.521255961" 
observedRunningTime="2026-02-27 06:29:40.756940617 +0000 UTC m=+1159.219561216" watchObservedRunningTime="2026-02-27 06:29:40.786109567 +0000 UTC m=+1159.248730126" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.173555 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.287037 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqjfj\" (UniqueName: \"kubernetes.io/projected/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-kube-api-access-bqjfj\") pod \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.287318 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-dns-svc\") pod \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.287364 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-config\") pod \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\" (UID: \"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad\") " Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.291209 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-kube-api-access-bqjfj" (OuterVolumeSpecName: "kube-api-access-bqjfj") pod "fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" (UID: "fdebbc54-95d3-4a4e-b072-ac9b2feb48ad"). InnerVolumeSpecName "kube-api-access-bqjfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.318116 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" (UID: "fdebbc54-95d3-4a4e-b072-ac9b2feb48ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.357057 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-config" (OuterVolumeSpecName: "config") pod "fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" (UID: "fdebbc54-95d3-4a4e-b072-ac9b2feb48ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.391489 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.391529 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.391544 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqjfj\" (UniqueName: \"kubernetes.io/projected/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad-kube-api-access-bqjfj\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.725502 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57667b4457-gphcn" event={"ID":"ff2b076f-c2ba-43e0-b914-addd791c7be3","Type":"ContainerStarted","Data":"c36d10e95e9241c9f0448515486a9e0855f03900b9a781a2bd7527720013e978"} 
Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.725576 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.731418 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" event={"ID":"fdebbc54-95d3-4a4e-b072-ac9b2feb48ad","Type":"ContainerDied","Data":"5fe7d36125513e94ae1b4bca527634b84de13f0a27c3d9bf7567fd720e2a2858"} Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.731475 4725 scope.go:117] "RemoveContainer" containerID="85b0480f0667c48356b5d0827258ae90efcc16eae422addaf6445a7e519ea3b5" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.731571 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d87c9cc-fv2k9" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.743748 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57667b4457-gphcn" podStartSLOduration=23.743733799 podStartE2EDuration="23.743733799s" podCreationTimestamp="2026-02-27 06:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:29:41.740093667 +0000 UTC m=+1160.202714236" watchObservedRunningTime="2026-02-27 06:29:41.743733799 +0000 UTC m=+1160.206354368" Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.791678 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d87c9cc-fv2k9"] Feb 27 06:29:41 crc kubenswrapper[4725]: I0227 06:29:41.799777 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d87c9cc-fv2k9"] Feb 27 06:29:42 crc kubenswrapper[4725]: I0227 06:29:42.267762 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c60e732-28c5-4ccf-8f0f-c656fcba8ca4" 
path="/var/lib/kubelet/pods/3c60e732-28c5-4ccf-8f0f-c656fcba8ca4/volumes" Feb 27 06:29:42 crc kubenswrapper[4725]: I0227 06:29:42.268200 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc99abc2-145a-4d33-b04c-f5293b264df0" path="/var/lib/kubelet/pods/cc99abc2-145a-4d33-b04c-f5293b264df0/volumes" Feb 27 06:29:42 crc kubenswrapper[4725]: I0227 06:29:42.268544 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" path="/var/lib/kubelet/pods/fdebbc54-95d3-4a4e-b072-ac9b2feb48ad/volumes" Feb 27 06:29:44 crc kubenswrapper[4725]: I0227 06:29:44.573393 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" Feb 27 06:29:44 crc kubenswrapper[4725]: I0227 06:29:44.620773 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57667b4457-gphcn"] Feb 27 06:29:44 crc kubenswrapper[4725]: I0227 06:29:44.620976 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57667b4457-gphcn" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="dnsmasq-dns" containerID="cri-o://c36d10e95e9241c9f0448515486a9e0855f03900b9a781a2bd7527720013e978" gracePeriod=10 Feb 27 06:29:44 crc kubenswrapper[4725]: I0227 06:29:44.760998 4725 generic.go:334] "Generic (PLEG): container finished" podID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerID="c36d10e95e9241c9f0448515486a9e0855f03900b9a781a2bd7527720013e978" exitCode=0 Feb 27 06:29:44 crc kubenswrapper[4725]: I0227 06:29:44.761061 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57667b4457-gphcn" event={"ID":"ff2b076f-c2ba-43e0-b914-addd791c7be3","Type":"ContainerDied","Data":"c36d10e95e9241c9f0448515486a9e0855f03900b9a781a2bd7527720013e978"} Feb 27 06:29:49 crc kubenswrapper[4725]: I0227 06:29:49.273428 4725 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-57667b4457-gphcn" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: connect: connection refused" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.038825 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.039485 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.039670 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n58qh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(7f24b5c8-baad-48b6-9242-2ad6bb6c471f): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.040893 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="7f24b5c8-baad-48b6-9242-2ad6bb6c471f" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.077963 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.078023 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.078157 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlrdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(76702ae7-c9e6-485b-abc9-b54e4c073ee1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.079848 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="76702ae7-c9e6-485b-abc9-b54e4c073ee1" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.163756 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.163812 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.164023 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:38.102.83.203:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n664h687h54bh5f8h558h5ddh67hbdhfch55fh697h588h659h5f4h58dh645h9chf5h4h9hb4hd5h548h5ddh57fh57bh555h56fhd9h675h5bh5dcq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vrpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&E
xecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(67b7afed-e3d9-42c8-9604-9d9e56f1bc1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.396909 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest" Feb 27 06:29:53 
crc kubenswrapper[4725]: E0227 06:29:53.396954 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.397072 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:38.102.83.203:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n668h57fh666h57fh545h66h59hcch68h676hd7h694h6fh5d8h64bh74h594h96h67fh55chd5h5cbh8dh679h78hf7h55fhb8h598h687h9fh598q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,R
eadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gv8m5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-6kvbc_openstack(03406108-89c6-4681-aeba-c6874d465b62): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.398371 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-6kvbc" podUID="03406108-89c6-4681-aeba-c6874d465b62" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.724754 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ovn-sb-db-server:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.724802 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ovn-sb-db-server:watcher_latest" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.724943 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:38.102.83.203:5001/podified-master-centos10/openstack-ovn-sb-db-server:watcher_latest,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n94h68fh589h54bh6h5c9h56fh6hffh85h575h549h5bfhdh694h557h668h5fdh658h97h54h4h55fh696h5bh7dhcfh669h66fhdfh546h56cq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxq4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecA
ction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(8a3fa421-de83-44cb-8857-ef6f679f37dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.840138 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest\\\"\"" pod="openstack/openstack-galera-0" podUID="7f24b5c8-baad-48b6-9242-2ad6bb6c471f" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.840154 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest\\\"\"" pod="openstack/ovn-controller-6kvbc" podUID="03406108-89c6-4681-aeba-c6874d465b62" Feb 27 06:29:53 crc kubenswrapper[4725]: E0227 06:29:53.840188 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-mariadb:watcher_latest\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="76702ae7-c9e6-485b-abc9-b54e4c073ee1" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.172312 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.213815 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdbzm\" (UniqueName: \"kubernetes.io/projected/ff2b076f-c2ba-43e0-b914-addd791c7be3-kube-api-access-rdbzm\") pod \"ff2b076f-c2ba-43e0-b914-addd791c7be3\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.213883 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-config\") pod \"ff2b076f-c2ba-43e0-b914-addd791c7be3\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.213947 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-dns-svc\") pod \"ff2b076f-c2ba-43e0-b914-addd791c7be3\" (UID: \"ff2b076f-c2ba-43e0-b914-addd791c7be3\") " Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.220468 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2b076f-c2ba-43e0-b914-addd791c7be3-kube-api-access-rdbzm" (OuterVolumeSpecName: "kube-api-access-rdbzm") pod "ff2b076f-c2ba-43e0-b914-addd791c7be3" (UID: "ff2b076f-c2ba-43e0-b914-addd791c7be3"). InnerVolumeSpecName "kube-api-access-rdbzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.285080 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-config" (OuterVolumeSpecName: "config") pod "ff2b076f-c2ba-43e0-b914-addd791c7be3" (UID: "ff2b076f-c2ba-43e0-b914-addd791c7be3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.292953 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff2b076f-c2ba-43e0-b914-addd791c7be3" (UID: "ff2b076f-c2ba-43e0-b914-addd791c7be3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.315507 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.315534 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdbzm\" (UniqueName: \"kubernetes.io/projected/ff2b076f-c2ba-43e0-b914-addd791c7be3-kube-api-access-rdbzm\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.315547 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2b076f-c2ba-43e0-b914-addd791c7be3-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:29:54 crc kubenswrapper[4725]: E0227 06:29:54.695971 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 27 06:29:54 crc kubenswrapper[4725]: E0227 06:29:54.696347 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 27 06:29:54 crc kubenswrapper[4725]: E0227 06:29:54.696545 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-474tq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
kube-state-metrics-0_openstack(b57cf7f3-cfa9-403a-8c71-84b46d6dd189): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 06:29:54 crc kubenswrapper[4725]: E0227 06:29:54.698060 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.859647 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57667b4457-gphcn" event={"ID":"ff2b076f-c2ba-43e0-b914-addd791c7be3","Type":"ContainerDied","Data":"4b784e42c52d1e4bca788f4ae89aea54ff070b3c3cb1be3c0e5cb5136c043360"} Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.859738 4725 scope.go:117] "RemoveContainer" containerID="c36d10e95e9241c9f0448515486a9e0855f03900b9a781a2bd7527720013e978" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.859668 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57667b4457-gphcn" Feb 27 06:29:54 crc kubenswrapper[4725]: E0227 06:29:54.861996 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.915680 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57667b4457-gphcn"] Feb 27 06:29:54 crc kubenswrapper[4725]: I0227 06:29:54.918233 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57667b4457-gphcn"] Feb 27 06:29:55 crc kubenswrapper[4725]: I0227 06:29:55.453735 4725 scope.go:117] "RemoveContainer" containerID="82532be0f76d13c4205f2f37b873be6420ba4f14dea3c720fd976527f2f00fb3" Feb 27 06:29:55 crc kubenswrapper[4725]: I0227 06:29:55.872369 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"163ce132-3935-4648-b50f-fab5db3c17ca","Type":"ContainerStarted","Data":"0968d6f5f6ff1273f598903c870a82f13af10d62ef3914bd718093eada0dea3f"} Feb 27 06:29:55 crc kubenswrapper[4725]: I0227 06:29:55.872820 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 27 06:29:55 crc kubenswrapper[4725]: I0227 06:29:55.893051 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.962081828 podStartE2EDuration="32.893027674s" podCreationTimestamp="2026-02-27 06:29:23 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.274392831 +0000 UTC m=+1158.737013400" lastFinishedPulling="2026-02-27 06:29:54.205338677 +0000 UTC m=+1172.667959246" observedRunningTime="2026-02-27 06:29:55.892060056 +0000 UTC m=+1174.354680635" watchObservedRunningTime="2026-02-27 
06:29:55.893027674 +0000 UTC m=+1174.355648283" Feb 27 06:29:55 crc kubenswrapper[4725]: E0227 06:29:55.931518 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="8a3fa421-de83-44cb-8857-ef6f679f37dc" Feb 27 06:29:55 crc kubenswrapper[4725]: E0227 06:29:55.935164 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="67b7afed-e3d9-42c8-9604-9d9e56f1bc1d" Feb 27 06:29:56 crc kubenswrapper[4725]: I0227 06:29:56.261815 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" path="/var/lib/kubelet/pods/ff2b076f-c2ba-43e0-b914-addd791c7be3/volumes" Feb 27 06:29:56 crc kubenswrapper[4725]: I0227 06:29:56.881839 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8a3fa421-de83-44cb-8857-ef6f679f37dc","Type":"ContainerStarted","Data":"8deb1ed7cb6e0fb09769727b5432a7dc0f741d7daadaa8ce8dd556a2ef100c21"} Feb 27 06:29:56 crc kubenswrapper[4725]: I0227 06:29:56.883657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d","Type":"ContainerStarted","Data":"abae3c782a85e1d1cf86dca3bfed0f4f90f794a46e7a1a9a4ad22a5ae039efef"} Feb 27 06:29:56 crc kubenswrapper[4725]: E0227 06:29:56.885213 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-ovn-sb-db-server:watcher_latest\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="8a3fa421-de83-44cb-8857-ef6f679f37dc" Feb 
27 06:29:56 crc kubenswrapper[4725]: E0227 06:29:56.885772 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="67b7afed-e3d9-42c8-9604-9d9e56f1bc1d" Feb 27 06:29:56 crc kubenswrapper[4725]: I0227 06:29:56.886066 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvrdk" event={"ID":"d05458c2-f003-46ea-a38c-eda2c69b4635","Type":"ContainerStarted","Data":"4db80299ff8bd5c5127f13fd937e4fa91a6fed992667f86856c5fc80115a4891"} Feb 27 06:29:57 crc kubenswrapper[4725]: I0227 06:29:57.899878 4725 generic.go:334] "Generic (PLEG): container finished" podID="d05458c2-f003-46ea-a38c-eda2c69b4635" containerID="4db80299ff8bd5c5127f13fd937e4fa91a6fed992667f86856c5fc80115a4891" exitCode=0 Feb 27 06:29:57 crc kubenswrapper[4725]: I0227 06:29:57.899956 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvrdk" event={"ID":"d05458c2-f003-46ea-a38c-eda2c69b4635","Type":"ContainerDied","Data":"4db80299ff8bd5c5127f13fd937e4fa91a6fed992667f86856c5fc80115a4891"} Feb 27 06:29:57 crc kubenswrapper[4725]: I0227 06:29:57.903748 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"bc2bb345-ef60-4c05-8461-1821e1db5216","Type":"ContainerStarted","Data":"099bf6ddcc61dc4f4862edbdcaba5b56beff3b6683678f541068a40c9e58e264"} Feb 27 06:29:57 crc kubenswrapper[4725]: I0227 06:29:57.907657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb0a4e8-6f65-4961-9caa-18d66a6754af","Type":"ContainerStarted","Data":"49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2"} Feb 27 06:29:57 crc kubenswrapper[4725]: E0227 06:29:57.909526 4725 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="67b7afed-e3d9-42c8-9604-9d9e56f1bc1d" Feb 27 06:29:57 crc kubenswrapper[4725]: E0227 06:29:57.916559 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-ovn-sb-db-server:watcher_latest\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="8a3fa421-de83-44cb-8857-ef6f679f37dc" Feb 27 06:29:58 crc kubenswrapper[4725]: I0227 06:29:58.917469 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ac67077-5fb4-4890-98ba-f5280a08e464","Type":"ContainerStarted","Data":"4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b"} Feb 27 06:29:58 crc kubenswrapper[4725]: I0227 06:29:58.919518 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerStarted","Data":"5e218148737f8bad9851446477af91e18255c885078c85ef97b521e222b40d2d"} Feb 27 06:29:58 crc kubenswrapper[4725]: I0227 06:29:58.922008 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvrdk" event={"ID":"d05458c2-f003-46ea-a38c-eda2c69b4635","Type":"ContainerStarted","Data":"7edf16cd55fcc4483c09b5539e91e36e5513897045660fa118e731ae9f21bac8"} Feb 27 06:29:58 crc kubenswrapper[4725]: I0227 06:29:58.922045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zvrdk" event={"ID":"d05458c2-f003-46ea-a38c-eda2c69b4635","Type":"ContainerStarted","Data":"a803cbbcc61865da0ae78eb3567afe1cdc0e338bab0a1163b4c79290fb5a51a7"} Feb 27 06:29:58 crc kubenswrapper[4725]: I0227 
06:29:58.922169 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:58 crc kubenswrapper[4725]: I0227 06:29:58.922387 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:29:59 crc kubenswrapper[4725]: I0227 06:29:59.014479 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zvrdk" podStartSLOduration=16.199666247 podStartE2EDuration="30.014457267s" podCreationTimestamp="2026-02-27 06:29:29 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.499802548 +0000 UTC m=+1158.962423117" lastFinishedPulling="2026-02-27 06:29:54.314593568 +0000 UTC m=+1172.777214137" observedRunningTime="2026-02-27 06:29:59.001602736 +0000 UTC m=+1177.464223345" watchObservedRunningTime="2026-02-27 06:29:59.014457267 +0000 UTC m=+1177.477077856" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.169426 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536230-wfrlq"] Feb 27 06:30:00 crc kubenswrapper[4725]: E0227 06:30:00.170361 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="dnsmasq-dns" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.170385 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="dnsmasq-dns" Feb 27 06:30:00 crc kubenswrapper[4725]: E0227 06:30:00.170435 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" containerName="init" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.170448 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" containerName="init" Feb 27 06:30:00 crc kubenswrapper[4725]: E0227 06:30:00.170465 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="init" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.170478 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="init" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.170835 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdebbc54-95d3-4a4e-b072-ac9b2feb48ad" containerName="init" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.170874 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2b076f-c2ba-43e0-b914-addd791c7be3" containerName="dnsmasq-dns" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.171723 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.174575 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.174680 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.174993 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.186657 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m"] Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.188167 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.193736 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.198944 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536230-wfrlq"] Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.203514 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.218343 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m"] Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.312631 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dvth\" (UniqueName: \"kubernetes.io/projected/1cd6e930-ab13-4ead-9173-1ecc6c561944-kube-api-access-8dvth\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.312715 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cd6e930-ab13-4ead-9173-1ecc6c561944-secret-volume\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.312760 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1cd6e930-ab13-4ead-9173-1ecc6c561944-config-volume\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.312948 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnvr\" (UniqueName: \"kubernetes.io/projected/f015d17a-1381-410a-92d4-a28b0a4a4b1b-kube-api-access-tqnvr\") pod \"auto-csr-approver-29536230-wfrlq\" (UID: \"f015d17a-1381-410a-92d4-a28b0a4a4b1b\") " pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.414571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnvr\" (UniqueName: \"kubernetes.io/projected/f015d17a-1381-410a-92d4-a28b0a4a4b1b-kube-api-access-tqnvr\") pod \"auto-csr-approver-29536230-wfrlq\" (UID: \"f015d17a-1381-410a-92d4-a28b0a4a4b1b\") " pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.414648 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dvth\" (UniqueName: \"kubernetes.io/projected/1cd6e930-ab13-4ead-9173-1ecc6c561944-kube-api-access-8dvth\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.414710 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cd6e930-ab13-4ead-9173-1ecc6c561944-secret-volume\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 
06:30:00.414752 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cd6e930-ab13-4ead-9173-1ecc6c561944-config-volume\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.415987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cd6e930-ab13-4ead-9173-1ecc6c561944-config-volume\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.432963 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnvr\" (UniqueName: \"kubernetes.io/projected/f015d17a-1381-410a-92d4-a28b0a4a4b1b-kube-api-access-tqnvr\") pod \"auto-csr-approver-29536230-wfrlq\" (UID: \"f015d17a-1381-410a-92d4-a28b0a4a4b1b\") " pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.433171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cd6e930-ab13-4ead-9173-1ecc6c561944-secret-volume\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.441213 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dvth\" (UniqueName: \"kubernetes.io/projected/1cd6e930-ab13-4ead-9173-1ecc6c561944-kube-api-access-8dvth\") pod \"collect-profiles-29536230-9h85m\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.504924 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.532686 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:00 crc kubenswrapper[4725]: I0227 06:30:00.939699 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536230-wfrlq"] Feb 27 06:30:00 crc kubenswrapper[4725]: W0227 06:30:00.946408 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf015d17a_1381_410a_92d4_a28b0a4a4b1b.slice/crio-69ef4a6b84cd36e7110b60d2b57c1e66bbd0915e978472616cb7c594cdcd566d WatchSource:0}: Error finding container 69ef4a6b84cd36e7110b60d2b57c1e66bbd0915e978472616cb7c594cdcd566d: Status 404 returned error can't find the container with id 69ef4a6b84cd36e7110b60d2b57c1e66bbd0915e978472616cb7c594cdcd566d Feb 27 06:30:01 crc kubenswrapper[4725]: W0227 06:30:01.010416 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cd6e930_ab13_4ead_9173_1ecc6c561944.slice/crio-959995e857d95a4594c1847962f978c28b86cef042b5f7636d9df32790894af8 WatchSource:0}: Error finding container 959995e857d95a4594c1847962f978c28b86cef042b5f7636d9df32790894af8: Status 404 returned error can't find the container with id 959995e857d95a4594c1847962f978c28b86cef042b5f7636d9df32790894af8 Feb 27 06:30:01 crc kubenswrapper[4725]: I0227 06:30:01.012407 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m"] Feb 27 06:30:01 crc kubenswrapper[4725]: I0227 06:30:01.948364 4725 
generic.go:334] "Generic (PLEG): container finished" podID="1cd6e930-ab13-4ead-9173-1ecc6c561944" containerID="83b4e247ca46f7289ff1df656689e037beda39fe69ad57127cbfe819ceea5984" exitCode=0 Feb 27 06:30:01 crc kubenswrapper[4725]: I0227 06:30:01.948578 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" event={"ID":"1cd6e930-ab13-4ead-9173-1ecc6c561944","Type":"ContainerDied","Data":"83b4e247ca46f7289ff1df656689e037beda39fe69ad57127cbfe819ceea5984"} Feb 27 06:30:01 crc kubenswrapper[4725]: I0227 06:30:01.948815 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" event={"ID":"1cd6e930-ab13-4ead-9173-1ecc6c561944","Type":"ContainerStarted","Data":"959995e857d95a4594c1847962f978c28b86cef042b5f7636d9df32790894af8"} Feb 27 06:30:01 crc kubenswrapper[4725]: I0227 06:30:01.951227 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" event={"ID":"f015d17a-1381-410a-92d4-a28b0a4a4b1b","Type":"ContainerStarted","Data":"69ef4a6b84cd36e7110b60d2b57c1e66bbd0915e978472616cb7c594cdcd566d"} Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.554964 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.555362 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 
06:30:02.555410 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.555882 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"861186ed7f2e4b7df76123b88a35b60ba94275897d0a13a296d2198ea2a7a166"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.555960 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://861186ed7f2e4b7df76123b88a35b60ba94275897d0a13a296d2198ea2a7a166" gracePeriod=600 Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.962040 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="861186ed7f2e4b7df76123b88a35b60ba94275897d0a13a296d2198ea2a7a166" exitCode=0 Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.962218 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"861186ed7f2e4b7df76123b88a35b60ba94275897d0a13a296d2198ea2a7a166"} Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.962405 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"28426146ca35fe9273793a5717f9e41fb0368e16b25b6d5b4d504e333b929ead"} Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.962426 4725 scope.go:117] 
"RemoveContainer" containerID="03dc8bea10798b61bde03e0e8912868ddc55a9db35d9f15615b091af21e96406" Feb 27 06:30:02 crc kubenswrapper[4725]: I0227 06:30:02.966170 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" event={"ID":"f015d17a-1381-410a-92d4-a28b0a4a4b1b","Type":"ContainerStarted","Data":"2b67a7c7e9507fe801321c5b5c42386650fa86500c931c5f96c64d0817d01649"} Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.012491 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" podStartSLOduration=1.389916449 podStartE2EDuration="3.012468854s" podCreationTimestamp="2026-02-27 06:30:00 +0000 UTC" firstStartedPulling="2026-02-27 06:30:00.94957772 +0000 UTC m=+1179.412198289" lastFinishedPulling="2026-02-27 06:30:02.572130125 +0000 UTC m=+1181.034750694" observedRunningTime="2026-02-27 06:30:03.002261377 +0000 UTC m=+1181.464881966" watchObservedRunningTime="2026-02-27 06:30:03.012468854 +0000 UTC m=+1181.475089423" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.257280 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.366016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dvth\" (UniqueName: \"kubernetes.io/projected/1cd6e930-ab13-4ead-9173-1ecc6c561944-kube-api-access-8dvth\") pod \"1cd6e930-ab13-4ead-9173-1ecc6c561944\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.366071 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cd6e930-ab13-4ead-9173-1ecc6c561944-config-volume\") pod \"1cd6e930-ab13-4ead-9173-1ecc6c561944\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.366216 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cd6e930-ab13-4ead-9173-1ecc6c561944-secret-volume\") pod \"1cd6e930-ab13-4ead-9173-1ecc6c561944\" (UID: \"1cd6e930-ab13-4ead-9173-1ecc6c561944\") " Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.366909 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cd6e930-ab13-4ead-9173-1ecc6c561944-config-volume" (OuterVolumeSpecName: "config-volume") pod "1cd6e930-ab13-4ead-9173-1ecc6c561944" (UID: "1cd6e930-ab13-4ead-9173-1ecc6c561944"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.371179 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd6e930-ab13-4ead-9173-1ecc6c561944-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1cd6e930-ab13-4ead-9173-1ecc6c561944" (UID: "1cd6e930-ab13-4ead-9173-1ecc6c561944"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.371357 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd6e930-ab13-4ead-9173-1ecc6c561944-kube-api-access-8dvth" (OuterVolumeSpecName: "kube-api-access-8dvth") pod "1cd6e930-ab13-4ead-9173-1ecc6c561944" (UID: "1cd6e930-ab13-4ead-9173-1ecc6c561944"). InnerVolumeSpecName "kube-api-access-8dvth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.468442 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cd6e930-ab13-4ead-9173-1ecc6c561944-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.468476 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dvth\" (UniqueName: \"kubernetes.io/projected/1cd6e930-ab13-4ead-9173-1ecc6c561944-kube-api-access-8dvth\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.468485 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cd6e930-ab13-4ead-9173-1ecc6c561944-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.980931 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" event={"ID":"1cd6e930-ab13-4ead-9173-1ecc6c561944","Type":"ContainerDied","Data":"959995e857d95a4594c1847962f978c28b86cef042b5f7636d9df32790894af8"} Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.980980 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959995e857d95a4594c1847962f978c28b86cef042b5f7636d9df32790894af8" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.981095 4725 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m" Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.983995 4725 generic.go:334] "Generic (PLEG): container finished" podID="f015d17a-1381-410a-92d4-a28b0a4a4b1b" containerID="2b67a7c7e9507fe801321c5b5c42386650fa86500c931c5f96c64d0817d01649" exitCode=0 Feb 27 06:30:03 crc kubenswrapper[4725]: I0227 06:30:03.984192 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" event={"ID":"f015d17a-1381-410a-92d4-a28b0a4a4b1b","Type":"ContainerDied","Data":"2b67a7c7e9507fe801321c5b5c42386650fa86500c931c5f96c64d0817d01649"} Feb 27 06:30:04 crc kubenswrapper[4725]: I0227 06:30:04.001414 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 27 06:30:04 crc kubenswrapper[4725]: I0227 06:30:04.997039 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7f24b5c8-baad-48b6-9242-2ad6bb6c471f","Type":"ContainerStarted","Data":"72092b2c7a382f04ec46d80835ca5466b7cc25cb21a103ed98dfc2ad8a23a663"} Feb 27 06:30:05 crc kubenswrapper[4725]: I0227 06:30:05.388226 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:05 crc kubenswrapper[4725]: I0227 06:30:05.499963 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqnvr\" (UniqueName: \"kubernetes.io/projected/f015d17a-1381-410a-92d4-a28b0a4a4b1b-kube-api-access-tqnvr\") pod \"f015d17a-1381-410a-92d4-a28b0a4a4b1b\" (UID: \"f015d17a-1381-410a-92d4-a28b0a4a4b1b\") " Feb 27 06:30:05 crc kubenswrapper[4725]: I0227 06:30:05.511015 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f015d17a-1381-410a-92d4-a28b0a4a4b1b-kube-api-access-tqnvr" (OuterVolumeSpecName: "kube-api-access-tqnvr") pod "f015d17a-1381-410a-92d4-a28b0a4a4b1b" (UID: "f015d17a-1381-410a-92d4-a28b0a4a4b1b"). InnerVolumeSpecName "kube-api-access-tqnvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:05 crc kubenswrapper[4725]: I0227 06:30:05.601603 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqnvr\" (UniqueName: \"kubernetes.io/projected/f015d17a-1381-410a-92d4-a28b0a4a4b1b-kube-api-access-tqnvr\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.005230 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" event={"ID":"f015d17a-1381-410a-92d4-a28b0a4a4b1b","Type":"ContainerDied","Data":"69ef4a6b84cd36e7110b60d2b57c1e66bbd0915e978472616cb7c594cdcd566d"} Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.005586 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69ef4a6b84cd36e7110b60d2b57c1e66bbd0915e978472616cb7c594cdcd566d" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.005233 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536230-wfrlq" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.006794 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76702ae7-c9e6-485b-abc9-b54e4c073ee1","Type":"ContainerStarted","Data":"a32b31db332adaa777b64c4359972047866cd3b7da2f19eeccb0da4a4a785a5a"} Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.038721 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d4746dfb9-lgz99"] Feb 27 06:30:06 crc kubenswrapper[4725]: E0227 06:30:06.039124 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd6e930-ab13-4ead-9173-1ecc6c561944" containerName="collect-profiles" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.039141 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd6e930-ab13-4ead-9173-1ecc6c561944" containerName="collect-profiles" Feb 27 06:30:06 crc kubenswrapper[4725]: E0227 06:30:06.039169 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f015d17a-1381-410a-92d4-a28b0a4a4b1b" containerName="oc" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.039177 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f015d17a-1381-410a-92d4-a28b0a4a4b1b" containerName="oc" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.039385 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd6e930-ab13-4ead-9173-1ecc6c561944" containerName="collect-profiles" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.039413 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f015d17a-1381-410a-92d4-a28b0a4a4b1b" containerName="oc" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.040493 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.074810 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4746dfb9-lgz99"] Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.211241 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-config\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.211342 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-dns-svc\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.211402 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7s25\" (UniqueName: \"kubernetes.io/projected/a73dee1c-bbf2-4637-b618-ee9baee91212-kube-api-access-f7s25\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.312525 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7s25\" (UniqueName: \"kubernetes.io/projected/a73dee1c-bbf2-4637-b618-ee9baee91212-kube-api-access-f7s25\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.312631 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-config\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.312677 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-dns-svc\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.313499 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-config\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.313497 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-dns-svc\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.327630 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7s25\" (UniqueName: \"kubernetes.io/projected/a73dee1c-bbf2-4637-b618-ee9baee91212-kube-api-access-f7s25\") pod \"dnsmasq-dns-7d4746dfb9-lgz99\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.362781 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.505564 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536224-m5jcf"] Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.512760 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536224-m5jcf"] Feb 27 06:30:06 crc kubenswrapper[4725]: I0227 06:30:06.818624 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4746dfb9-lgz99"] Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.013426 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" event={"ID":"a73dee1c-bbf2-4637-b618-ee9baee91212","Type":"ContainerStarted","Data":"a5e544fb659f275201a7603338ebcccc2fc5131665a168ab740c8050dab7cfc4"} Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.013684 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" event={"ID":"a73dee1c-bbf2-4637-b618-ee9baee91212","Type":"ContainerStarted","Data":"87ff56aa3c149ec25a1d4234be53c17ea3edd83e90cd261e9ef4142bdd103821"} Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.014832 4725 generic.go:334] "Generic (PLEG): container finished" podID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerID="5e218148737f8bad9851446477af91e18255c885078c85ef97b521e222b40d2d" exitCode=0 Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.014867 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerDied","Data":"5e218148737f8bad9851446477af91e18255c885078c85ef97b521e222b40d2d"} Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.016059 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 
06:30:07.144716 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.155531 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.158000 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.158451 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.158667 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-nsfp4" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.159298 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.160859 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.226632 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8rc\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-kube-api-access-hs8rc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.227001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.227197 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.227400 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/872eba69-b1d2-4028-b65f-b70fa14daeb0-lock\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.227554 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872eba69-b1d2-4028-b65f-b70fa14daeb0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.227727 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/872eba69-b1d2-4028-b65f-b70fa14daeb0-cache\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.335248 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8rc\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-kube-api-access-hs8rc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.335315 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: 
\"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.335350 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.335375 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/872eba69-b1d2-4028-b65f-b70fa14daeb0-lock\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.335388 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872eba69-b1d2-4028-b65f-b70fa14daeb0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.335408 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/872eba69-b1d2-4028-b65f-b70fa14daeb0-cache\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.336730 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.336955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" 
(UniqueName: \"kubernetes.io/empty-dir/872eba69-b1d2-4028-b65f-b70fa14daeb0-lock\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.336979 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/872eba69-b1d2-4028-b65f-b70fa14daeb0-cache\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: E0227 06:30:07.337063 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 06:30:07 crc kubenswrapper[4725]: E0227 06:30:07.337076 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 06:30:07 crc kubenswrapper[4725]: E0227 06:30:07.337111 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:30:07.837097364 +0000 UTC m=+1186.299717933 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.343874 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872eba69-b1d2-4028-b65f-b70fa14daeb0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.355015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs8rc\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-kube-api-access-hs8rc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.355944 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: I0227 06:30:07.844947 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:07 crc kubenswrapper[4725]: E0227 06:30:07.845156 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 06:30:07 crc kubenswrapper[4725]: E0227 06:30:07.845350 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 06:30:07 crc kubenswrapper[4725]: E0227 06:30:07.845416 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:30:08.845397955 +0000 UTC m=+1187.308018534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found Feb 27 06:30:08 crc kubenswrapper[4725]: I0227 06:30:08.025841 4725 generic.go:334] "Generic (PLEG): container finished" podID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerID="a5e544fb659f275201a7603338ebcccc2fc5131665a168ab740c8050dab7cfc4" exitCode=0 Feb 27 06:30:08 crc kubenswrapper[4725]: I0227 06:30:08.025892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" event={"ID":"a73dee1c-bbf2-4637-b618-ee9baee91212","Type":"ContainerDied","Data":"a5e544fb659f275201a7603338ebcccc2fc5131665a168ab740c8050dab7cfc4"} Feb 27 06:30:08 crc kubenswrapper[4725]: I0227 06:30:08.265508 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab9bd04-3665-4720-8cf9-9fb9cf78a016" path="/var/lib/kubelet/pods/fab9bd04-3665-4720-8cf9-9fb9cf78a016/volumes" Feb 27 06:30:08 crc kubenswrapper[4725]: I0227 06:30:08.861152 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:08 crc kubenswrapper[4725]: E0227 06:30:08.861345 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: 
configmap "swift-ring-files" not found Feb 27 06:30:08 crc kubenswrapper[4725]: E0227 06:30:08.861366 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 06:30:08 crc kubenswrapper[4725]: E0227 06:30:08.861445 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:30:10.861427368 +0000 UTC m=+1189.324047937 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found Feb 27 06:30:09 crc kubenswrapper[4725]: I0227 06:30:09.032943 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" event={"ID":"a73dee1c-bbf2-4637-b618-ee9baee91212","Type":"ContainerStarted","Data":"e446b1467990d0319f6fa5f6e6829d0195dec81e99b0e90ae70e2c12e92ec51b"} Feb 27 06:30:09 crc kubenswrapper[4725]: I0227 06:30:09.033904 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:09 crc kubenswrapper[4725]: I0227 06:30:09.035615 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b57cf7f3-cfa9-403a-8c71-84b46d6dd189","Type":"ContainerStarted","Data":"eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2"} Feb 27 06:30:09 crc kubenswrapper[4725]: I0227 06:30:09.036007 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 27 06:30:09 crc kubenswrapper[4725]: I0227 06:30:09.056718 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" podStartSLOduration=3.056703048 podStartE2EDuration="3.056703048s" podCreationTimestamp="2026-02-27 06:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:09.05109801 +0000 UTC m=+1187.513718579" watchObservedRunningTime="2026-02-27 06:30:09.056703048 +0000 UTC m=+1187.519323617" Feb 27 06:30:09 crc kubenswrapper[4725]: I0227 06:30:09.070956 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.024607309 podStartE2EDuration="44.070940398s" podCreationTimestamp="2026-02-27 06:29:25 +0000 UTC" firstStartedPulling="2026-02-27 06:29:39.658890618 +0000 UTC m=+1158.121511187" lastFinishedPulling="2026-02-27 06:30:08.705223687 +0000 UTC m=+1187.167844276" observedRunningTime="2026-02-27 06:30:09.070174487 +0000 UTC m=+1187.532795056" watchObservedRunningTime="2026-02-27 06:30:09.070940398 +0000 UTC m=+1187.533560957" Feb 27 06:30:10 crc kubenswrapper[4725]: I0227 06:30:10.045712 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kvbc" event={"ID":"03406108-89c6-4681-aeba-c6874d465b62","Type":"ContainerStarted","Data":"c578b0af3ada7cc40d25d26d04abb28c1d2b430be71574fcc69a6f7a4f1cd116"} Feb 27 06:30:10 crc kubenswrapper[4725]: I0227 06:30:10.046714 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6kvbc" Feb 27 06:30:10 crc kubenswrapper[4725]: I0227 06:30:10.067487 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6kvbc" podStartSLOduration=12.093812187 podStartE2EDuration="41.067466174s" podCreationTimestamp="2026-02-27 06:29:29 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.39425447 +0000 UTC m=+1158.856875039" lastFinishedPulling="2026-02-27 06:30:09.367908457 +0000 UTC m=+1187.830529026" 
observedRunningTime="2026-02-27 06:30:10.063107752 +0000 UTC m=+1188.525728331" watchObservedRunningTime="2026-02-27 06:30:10.067466174 +0000 UTC m=+1188.530086743" Feb 27 06:30:10 crc kubenswrapper[4725]: I0227 06:30:10.901344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:10 crc kubenswrapper[4725]: E0227 06:30:10.901614 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 06:30:10 crc kubenswrapper[4725]: E0227 06:30:10.901643 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 06:30:10 crc kubenswrapper[4725]: E0227 06:30:10.901746 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:30:14.901722578 +0000 UTC m=+1193.364343157 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.180865 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bq24l"] Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.182600 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bq24l"] Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.182768 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.189666 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.189835 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.190855 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310581 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-combined-ca-bundle\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310700 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-scripts\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310731 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-ring-data-devices\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310802 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-dispersionconf\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310897 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40a2ae59-8725-42be-984a-739a82d476c5-etc-swift\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310924 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-swiftconf\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.310947 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j65j\" (UniqueName: \"kubernetes.io/projected/40a2ae59-8725-42be-984a-739a82d476c5-kube-api-access-5j65j\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412205 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-ring-data-devices\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412311 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-dispersionconf\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412378 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40a2ae59-8725-42be-984a-739a82d476c5-etc-swift\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412400 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-swiftconf\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412422 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j65j\" (UniqueName: \"kubernetes.io/projected/40a2ae59-8725-42be-984a-739a82d476c5-kube-api-access-5j65j\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-combined-ca-bundle\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412516 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-scripts\") pod 
\"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.412972 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40a2ae59-8725-42be-984a-739a82d476c5-etc-swift\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.413043 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-ring-data-devices\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.413157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-scripts\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.424674 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-dispersionconf\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.424728 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-swiftconf\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc 
kubenswrapper[4725]: I0227 06:30:11.429740 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j65j\" (UniqueName: \"kubernetes.io/projected/40a2ae59-8725-42be-984a-739a82d476c5-kube-api-access-5j65j\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.431068 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-combined-ca-bundle\") pod \"swift-ring-rebalance-bq24l\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:11 crc kubenswrapper[4725]: I0227 06:30:11.512403 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:12 crc kubenswrapper[4725]: I0227 06:30:12.047271 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bq24l"] Feb 27 06:30:12 crc kubenswrapper[4725]: W0227 06:30:12.058538 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a2ae59_8725_42be_984a_739a82d476c5.slice/crio-36ca0f299c2944a6ba4d1303a189b6ef5b34d3bf647c5c7a1e6e35bee8079be6 WatchSource:0}: Error finding container 36ca0f299c2944a6ba4d1303a189b6ef5b34d3bf647c5c7a1e6e35bee8079be6: Status 404 returned error can't find the container with id 36ca0f299c2944a6ba4d1303a189b6ef5b34d3bf647c5c7a1e6e35bee8079be6 Feb 27 06:30:13 crc kubenswrapper[4725]: I0227 06:30:13.092514 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bq24l" event={"ID":"40a2ae59-8725-42be-984a-739a82d476c5","Type":"ContainerStarted","Data":"36ca0f299c2944a6ba4d1303a189b6ef5b34d3bf647c5c7a1e6e35bee8079be6"} Feb 27 06:30:13 crc kubenswrapper[4725]: I0227 
06:30:13.094759 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"67b7afed-e3d9-42c8-9604-9d9e56f1bc1d","Type":"ContainerStarted","Data":"5e4f48be6abda4c99d7b26cf89ba4beba5976282b7a39ccb948c0477bf26b23f"} Feb 27 06:30:13 crc kubenswrapper[4725]: I0227 06:30:13.119640 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.623555136 podStartE2EDuration="41.11962206s" podCreationTimestamp="2026-02-27 06:29:32 +0000 UTC" firstStartedPulling="2026-02-27 06:29:39.858133198 +0000 UTC m=+1158.320753767" lastFinishedPulling="2026-02-27 06:30:12.354200112 +0000 UTC m=+1190.816820691" observedRunningTime="2026-02-27 06:30:13.11534478 +0000 UTC m=+1191.577965359" watchObservedRunningTime="2026-02-27 06:30:13.11962206 +0000 UTC m=+1191.582242629" Feb 27 06:30:13 crc kubenswrapper[4725]: I0227 06:30:13.853361 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 27 06:30:14 crc kubenswrapper[4725]: I0227 06:30:14.974975 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:14 crc kubenswrapper[4725]: E0227 06:30:14.975470 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 06:30:14 crc kubenswrapper[4725]: E0227 06:30:14.975599 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 06:30:14 crc kubenswrapper[4725]: E0227 06:30:14.975658 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift 
podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:30:22.97564316 +0000 UTC m=+1201.438263729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found Feb 27 06:30:15 crc kubenswrapper[4725]: I0227 06:30:15.109818 4725 generic.go:334] "Generic (PLEG): container finished" podID="7f24b5c8-baad-48b6-9242-2ad6bb6c471f" containerID="72092b2c7a382f04ec46d80835ca5466b7cc25cb21a103ed98dfc2ad8a23a663" exitCode=0 Feb 27 06:30:15 crc kubenswrapper[4725]: I0227 06:30:15.109878 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7f24b5c8-baad-48b6-9242-2ad6bb6c471f","Type":"ContainerDied","Data":"72092b2c7a382f04ec46d80835ca5466b7cc25cb21a103ed98dfc2ad8a23a663"} Feb 27 06:30:15 crc kubenswrapper[4725]: I0227 06:30:15.853962 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 27 06:30:15 crc kubenswrapper[4725]: I0227 06:30:15.926364 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 27 06:30:16 crc kubenswrapper[4725]: I0227 06:30:16.018549 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 27 06:30:16 crc kubenswrapper[4725]: I0227 06:30:16.129153 4725 generic.go:334] "Generic (PLEG): container finished" podID="76702ae7-c9e6-485b-abc9-b54e4c073ee1" containerID="a32b31db332adaa777b64c4359972047866cd3b7da2f19eeccb0da4a4a785a5a" exitCode=0 Feb 27 06:30:16 crc kubenswrapper[4725]: I0227 06:30:16.129275 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"76702ae7-c9e6-485b-abc9-b54e4c073ee1","Type":"ContainerDied","Data":"a32b31db332adaa777b64c4359972047866cd3b7da2f19eeccb0da4a4a785a5a"} Feb 27 06:30:16 crc kubenswrapper[4725]: I0227 06:30:16.365547 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:16 crc kubenswrapper[4725]: I0227 06:30:16.413080 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c6656465-qtdhm"] Feb 27 06:30:16 crc kubenswrapper[4725]: I0227 06:30:16.413310 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="dnsmasq-dns" containerID="cri-o://69b1085067c728d7192bc6eaf76f3f4c6c15a9b088a01e3f18c1ff2b5a982947" gracePeriod=10 Feb 27 06:30:18 crc kubenswrapper[4725]: I0227 06:30:18.898921 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.161862 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c8655694f-kqd5v"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.166689 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.168705 4725 generic.go:334] "Generic (PLEG): container finished" podID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerID="69b1085067c728d7192bc6eaf76f3f4c6c15a9b088a01e3f18c1ff2b5a982947" exitCode=0 Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.168839 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" event={"ID":"53bf2b68-ae88-4d0a-af2e-2b4a67e35257","Type":"ContainerDied","Data":"69b1085067c728d7192bc6eaf76f3f4c6c15a9b088a01e3f18c1ff2b5a982947"} Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.173824 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.242280 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8655694f-kqd5v"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.251996 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-s2ht5"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.253083 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.255190 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.258976 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-config\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.259059 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-dns-svc\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.259094 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.259118 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx2p4\" (UniqueName: \"kubernetes.io/projected/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-kube-api-access-kx2p4\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.283010 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-s2ht5"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.360876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-dns-svc\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.360942 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474555a6-7d91-4881-a4c7-785ccf8185cc-config\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.360983 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474555a6-7d91-4881-a4c7-785ccf8185cc-ovn-rundir\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.361008 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.361040 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4fg4\" (UniqueName: \"kubernetes.io/projected/474555a6-7d91-4881-a4c7-785ccf8185cc-kube-api-access-s4fg4\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " 
pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.361071 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx2p4\" (UniqueName: \"kubernetes.io/projected/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-kube-api-access-kx2p4\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.361091 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474555a6-7d91-4881-a4c7-785ccf8185cc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.362027 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.362225 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474555a6-7d91-4881-a4c7-785ccf8185cc-ovs-rundir\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.362326 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474555a6-7d91-4881-a4c7-785ccf8185cc-combined-ca-bundle\") pod \"ovn-controller-metrics-s2ht5\" (UID: 
\"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.362420 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-config\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.362494 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-dns-svc\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.364019 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-config\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.387073 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx2p4\" (UniqueName: \"kubernetes.io/projected/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-kube-api-access-kx2p4\") pod \"dnsmasq-dns-7c8655694f-kqd5v\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") " pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.464299 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474555a6-7d91-4881-a4c7-785ccf8185cc-ovn-rundir\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc 
kubenswrapper[4725]: I0227 06:30:19.464354 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4fg4\" (UniqueName: \"kubernetes.io/projected/474555a6-7d91-4881-a4c7-785ccf8185cc-kube-api-access-s4fg4\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.464377 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474555a6-7d91-4881-a4c7-785ccf8185cc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.464427 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474555a6-7d91-4881-a4c7-785ccf8185cc-ovs-rundir\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.464462 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474555a6-7d91-4881-a4c7-785ccf8185cc-combined-ca-bundle\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.464547 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474555a6-7d91-4881-a4c7-785ccf8185cc-config\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.465202 
4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474555a6-7d91-4881-a4c7-785ccf8185cc-ovn-rundir\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.465277 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474555a6-7d91-4881-a4c7-785ccf8185cc-ovs-rundir\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.465315 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474555a6-7d91-4881-a4c7-785ccf8185cc-config\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.488267 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474555a6-7d91-4881-a4c7-785ccf8185cc-combined-ca-bundle\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.493843 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4fg4\" (UniqueName: \"kubernetes.io/projected/474555a6-7d91-4881-a4c7-785ccf8185cc-kube-api-access-s4fg4\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.494528 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/474555a6-7d91-4881-a4c7-785ccf8185cc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-s2ht5\" (UID: \"474555a6-7d91-4881-a4c7-785ccf8185cc\") " pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.496246 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8655694f-kqd5v"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.496951 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.529329 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444c9d757-pr69s"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.534851 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.540877 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.552277 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444c9d757-pr69s"] Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.574675 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-s2ht5" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.667357 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-config\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.667413 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-sb\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.667436 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz2qt\" (UniqueName: \"kubernetes.io/projected/aa2e7bcc-3655-4f34-8f6b-1cc325681122-kube-api-access-qz2qt\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.667508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-dns-svc\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.667547 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-nb\") pod \"dnsmasq-dns-6444c9d757-pr69s\" 
(UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.773248 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-config\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.771894 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-config\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.773358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-sb\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.773378 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz2qt\" (UniqueName: \"kubernetes.io/projected/aa2e7bcc-3655-4f34-8f6b-1cc325681122-kube-api-access-qz2qt\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.778996 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-sb\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " 
pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.780492 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-dns-svc\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.780620 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-nb\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.781478 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-nb\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.781841 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-dns-svc\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.804077 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz2qt\" (UniqueName: \"kubernetes.io/projected/aa2e7bcc-3655-4f34-8f6b-1cc325681122-kube-api-access-qz2qt\") pod \"dnsmasq-dns-6444c9d757-pr69s\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:19 crc kubenswrapper[4725]: I0227 06:30:19.926357 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.046712 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0"
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.047882 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.047911 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.047981 4725 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:30:39.047958849 +0000 UTC m=+1217.510579418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.382797 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c6656465-qtdhm"
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.442513 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-swift-proxy-server:watcher_latest"
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.442568 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-swift-proxy-server:watcher_latest"
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.442688 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:swift-ring-rebalance,Image:38.102.83.203:5001/podified-master-centos10/openstack-swift-proxy-server:watcher_latest,Command:[/usr/local/bin/swift-ring-tool all],Args:[],WorkingDir:/etc/swift,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CM_NAME,Value:swift-ring-files,ValueFrom:nil,},EnvVar{Name:NAMESPACE,Value:openstack,ValueFrom:nil,},EnvVar{Name:OWNER_APIVERSION,Value:swift.openstack.org/v1beta1,ValueFrom:nil,},EnvVar{Name:OWNER_KIND,Value:SwiftRing,ValueFrom:nil,},EnvVar{Name:OWNER_NAME,Value:swift-ring,ValueFrom:nil,},EnvVar{Name:OWNER_UID,Value:f6255343-1006-48c2-abcd-1f82bd704eab,ValueFrom:nil,},EnvVar{Name:SWIFT_MIN_PART_HOURS,Value:1,ValueFrom:nil,},EnvVar{Name:SWIFT_PART_POWER,Value:10,ValueFrom:nil,},EnvVar{Name:SWIFT_REPLICAS,Value:1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/swift-ring-tool,SubPath:swift-ring-tool,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:swiftconf,ReadOnly:true,MountPath:/etc/swift/swift.conf,SubPath:swift.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ring-data-devices,ReadOnly:true,MountPath:/var/lib/config-data/ring-devices,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dispersionconf,ReadOnly:true,MountPath:/etc/swift/dispersion.conf,SubPath:dispersion.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5j65j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-ring-rebalance-bq24l_openstack(40a2ae59-8725-42be-984a-739a82d476c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 27 06:30:23 crc kubenswrapper[4725]: E0227 06:30:23.443930 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/swift-ring-rebalance-bq24l" podUID="40a2ae59-8725-42be-984a-739a82d476c5"
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.451837 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-config\") pod \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") "
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.451906 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xt9\" (UniqueName: \"kubernetes.io/projected/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-kube-api-access-k4xt9\") pod \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") "
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.452022 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-dns-svc\") pod \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\" (UID: \"53bf2b68-ae88-4d0a-af2e-2b4a67e35257\") "
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.457467 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-kube-api-access-k4xt9" (OuterVolumeSpecName: "kube-api-access-k4xt9") pod "53bf2b68-ae88-4d0a-af2e-2b4a67e35257" (UID: "53bf2b68-ae88-4d0a-af2e-2b4a67e35257"). InnerVolumeSpecName "kube-api-access-k4xt9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.491727 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-config" (OuterVolumeSpecName: "config") pod "53bf2b68-ae88-4d0a-af2e-2b4a67e35257" (UID: "53bf2b68-ae88-4d0a-af2e-2b4a67e35257"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.497192 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53bf2b68-ae88-4d0a-af2e-2b4a67e35257" (UID: "53bf2b68-ae88-4d0a-af2e-2b4a67e35257"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.553799 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.553831 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.553845 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4xt9\" (UniqueName: \"kubernetes.io/projected/53bf2b68-ae88-4d0a-af2e-2b4a67e35257-kube-api-access-k4xt9\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:23 crc kubenswrapper[4725]: I0227 06:30:23.993941 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444c9d757-pr69s"]
Feb 27 06:30:24 crc kubenswrapper[4725]: W0227 06:30:24.001158 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa2e7bcc_3655_4f34_8f6b_1cc325681122.slice/crio-b5b60f0bbf1301497a8acd12403a2147fff7747f2d0ff2c5b834b2bebc81ae13 WatchSource:0}: Error finding container b5b60f0bbf1301497a8acd12403a2147fff7747f2d0ff2c5b834b2bebc81ae13: Status 404 returned error can't find the container with id b5b60f0bbf1301497a8acd12403a2147fff7747f2d0ff2c5b834b2bebc81ae13
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.047906 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-s2ht5"]
Feb 27 06:30:24 crc kubenswrapper[4725]: W0227 06:30:24.105394 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04fce22e_2dc9_4c64_85d9_d7f38f23d4b9.slice/crio-85d27197b323aaf31d0495e6333cdb9783d551066ab636d092fcfa293b7d5ad0 WatchSource:0}: Error finding container 85d27197b323aaf31d0495e6333cdb9783d551066ab636d092fcfa293b7d5ad0: Status 404 returned error can't find the container with id 85d27197b323aaf31d0495e6333cdb9783d551066ab636d092fcfa293b7d5ad0
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.109160 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8655694f-kqd5v"]
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.230940 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerStarted","Data":"ae592e258d79cb09c8b03281de57f0d5e3f84ba2c19b27c712b185ab71c40bbf"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.232434 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76702ae7-c9e6-485b-abc9-b54e4c073ee1","Type":"ContainerStarted","Data":"6754455aacf02d0e1e30bfb49c3d9568b3235828a3294eec9371c1d39d85e466"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.234493 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" event={"ID":"53bf2b68-ae88-4d0a-af2e-2b4a67e35257","Type":"ContainerDied","Data":"b6ab525b7bddefa16e7d5ec40b8ce52e4d2a9062d9ab44d410f35207bc11091d"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.234533 4725 scope.go:117] "RemoveContainer" containerID="69b1085067c728d7192bc6eaf76f3f4c6c15a9b088a01e3f18c1ff2b5a982947"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.234537 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c6656465-qtdhm"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.235220 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-s2ht5" event={"ID":"474555a6-7d91-4881-a4c7-785ccf8185cc","Type":"ContainerStarted","Data":"a627971a54c5e619c0de646b0533261c3a606d31fe6fd0c4794d54bad039f968"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.237103 4725 generic.go:334] "Generic (PLEG): container finished" podID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerID="3ce4aa8455e0478d967e778306adbf24b533de04af0ffe96be00d005cbc134a7" exitCode=0
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.237163 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" event={"ID":"aa2e7bcc-3655-4f34-8f6b-1cc325681122","Type":"ContainerDied","Data":"3ce4aa8455e0478d967e778306adbf24b533de04af0ffe96be00d005cbc134a7"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.237181 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" event={"ID":"aa2e7bcc-3655-4f34-8f6b-1cc325681122","Type":"ContainerStarted","Data":"b5b60f0bbf1301497a8acd12403a2147fff7747f2d0ff2c5b834b2bebc81ae13"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.238118 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" event={"ID":"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9","Type":"ContainerStarted","Data":"85d27197b323aaf31d0495e6333cdb9783d551066ab636d092fcfa293b7d5ad0"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.244950 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8a3fa421-de83-44cb-8857-ef6f679f37dc","Type":"ContainerStarted","Data":"b9b6fc4939b314996aedd694fe0ee70ad5087b53e923bcd409b35df42fed675f"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.251630 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371974.603163 podStartE2EDuration="1m2.251612878s" podCreationTimestamp="2026-02-27 06:29:22 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.412886504 +0000 UTC m=+1158.875507063" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:24.249482838 +0000 UTC m=+1202.712103407" watchObservedRunningTime="2026-02-27 06:30:24.251612878 +0000 UTC m=+1202.714233447"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.270384 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7f24b5c8-baad-48b6-9242-2ad6bb6c471f","Type":"ContainerStarted","Data":"8153b4b59737722512631df74ac2e8b7838022191b2267764d3c5feca9181e37"}
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.277892 4725 scope.go:117] "RemoveContainer" containerID="32a36bcb5697cf7fbd40ee05ddd63690e34560a52c053f76c1e6673439cccfe1"
Feb 27 06:30:24 crc kubenswrapper[4725]: E0227 06:30:24.277949 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"swift-ring-rebalance\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-swift-proxy-server:watcher_latest\\\"\"" pod="openstack/swift-ring-rebalance-bq24l" podUID="40a2ae59-8725-42be-984a-739a82d476c5"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.334263 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.189428685 podStartE2EDuration="52.334248751s" podCreationTimestamp="2026-02-27 06:29:32 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.405529757 +0000 UTC m=+1158.868150336" lastFinishedPulling="2026-02-27 06:30:23.550349833 +0000 UTC m=+1202.012970402" observedRunningTime="2026-02-27 06:30:24.306349527 +0000 UTC m=+1202.768970106" watchObservedRunningTime="2026-02-27 06:30:24.334248751 +0000 UTC m=+1202.796869320"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.339699 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=39.745547459 podStartE2EDuration="1m4.339687174s" podCreationTimestamp="2026-02-27 06:29:20 +0000 UTC" firstStartedPulling="2026-02-27 06:29:39.728841643 +0000 UTC m=+1158.191462212" lastFinishedPulling="2026-02-27 06:30:04.322981348 +0000 UTC m=+1182.785601927" observedRunningTime="2026-02-27 06:30:24.334168519 +0000 UTC m=+1202.796789078" watchObservedRunningTime="2026-02-27 06:30:24.339687174 +0000 UTC m=+1202.802307743"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.377184 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c6656465-qtdhm"]
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.383367 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9c6656465-qtdhm"]
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.575471 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9c6656465-qtdhm" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.106:5353: i/o timeout"
Feb 27 06:30:24 crc kubenswrapper[4725]: I0227 06:30:24.714927 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.276215 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-s2ht5" event={"ID":"474555a6-7d91-4881-a4c7-785ccf8185cc","Type":"ContainerStarted","Data":"92056497d0e64015b2072a2f5bc2cc6cb2e78ffa113c038eb75b00ef2c5cae88"}
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.281722 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" event={"ID":"aa2e7bcc-3655-4f34-8f6b-1cc325681122","Type":"ContainerStarted","Data":"c83f0cda61e0eeca3c659f093826b055b5486959ac3ce506b78bc7b5675e6502"}
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.281846 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444c9d757-pr69s"
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.284569 4725 generic.go:334] "Generic (PLEG): container finished" podID="04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" containerID="ed344764d634a73a47a6bfe8db2b655d25c77383d0b3947fcfcee1d245dd1178" exitCode=0
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.286244 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" event={"ID":"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9","Type":"ContainerDied","Data":"ed344764d634a73a47a6bfe8db2b655d25c77383d0b3947fcfcee1d245dd1178"}
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.321725 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-s2ht5" podStartSLOduration=6.321701762 podStartE2EDuration="6.321701762s" podCreationTimestamp="2026-02-27 06:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:25.296241267 +0000 UTC m=+1203.758861876" watchObservedRunningTime="2026-02-27 06:30:25.321701762 +0000 UTC m=+1203.784322341"
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.666414 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v"
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.690536 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" podStartSLOduration=6.690514351 podStartE2EDuration="6.690514351s" podCreationTimestamp="2026-02-27 06:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:25.348565768 +0000 UTC m=+1203.811186347" watchObservedRunningTime="2026-02-27 06:30:25.690514351 +0000 UTC m=+1204.153134920"
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.808894 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-ovsdbserver-nb\") pod \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") "
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.809216 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx2p4\" (UniqueName: \"kubernetes.io/projected/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-kube-api-access-kx2p4\") pod \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") "
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.809397 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-dns-svc\") pod \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") "
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.809710 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-config\") pod \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\" (UID: \"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9\") "
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.829759 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" (UID: "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.831890 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" (UID: "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.841000 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-config" (OuterVolumeSpecName: "config") pod "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" (UID: "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.912321 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.912356 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:25 crc kubenswrapper[4725]: I0227 06:30:25.912369 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.075349 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-kube-api-access-kx2p4" (OuterVolumeSpecName: "kube-api-access-kx2p4") pod "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" (UID: "04fce22e-2dc9-4c64-85d9-d7f38f23d4b9"). InnerVolumeSpecName "kube-api-access-kx2p4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.116129 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx2p4\" (UniqueName: \"kubernetes.io/projected/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9-kube-api-access-kx2p4\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:26 crc kubenswrapper[4725]: E0227 06:30:26.263964 4725 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:38240->38.102.83.192:37635: write tcp 38.102.83.192:38240->38.102.83.192:37635: write: broken pipe
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.269193 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" path="/var/lib/kubelet/pods/53bf2b68-ae88-4d0a-af2e-2b4a67e35257/volumes"
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.309242 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v" event={"ID":"04fce22e-2dc9-4c64-85d9-d7f38f23d4b9","Type":"ContainerDied","Data":"85d27197b323aaf31d0495e6333cdb9783d551066ab636d092fcfa293b7d5ad0"}
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.309322 4725 scope.go:117] "RemoveContainer" containerID="ed344764d634a73a47a6bfe8db2b655d25c77383d0b3947fcfcee1d245dd1178"
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.309636 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8655694f-kqd5v"
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.373204 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8655694f-kqd5v"]
Feb 27 06:30:26 crc kubenswrapper[4725]: I0227 06:30:26.379903 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c8655694f-kqd5v"]
Feb 27 06:30:27 crc kubenswrapper[4725]: I0227 06:30:27.320028 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerStarted","Data":"e63a4b09ef3287ec49a1d044f8543c0f227df5129cec65d081f31d4a607b09cb"}
Feb 27 06:30:27 crc kubenswrapper[4725]: I0227 06:30:27.773527 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 27 06:30:27 crc kubenswrapper[4725]: I0227 06:30:27.774206 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.265263 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" path="/var/lib/kubelet/pods/04fce22e-2dc9-4c64-85d9-d7f38f23d4b9/volumes"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.368954 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.589468 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 27 06:30:28 crc kubenswrapper[4725]: E0227 06:30:28.589953 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="dnsmasq-dns"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.589968 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="dnsmasq-dns"
Feb 27 06:30:28 crc kubenswrapper[4725]: E0227 06:30:28.589980 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" containerName="init"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.589986 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" containerName="init"
Feb 27 06:30:28 crc kubenswrapper[4725]: E0227 06:30:28.590010 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="init"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.590016 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="init"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.590152 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bf2b68-ae88-4d0a-af2e-2b4a67e35257" containerName="dnsmasq-dns"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.590164 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fce22e-2dc9-4c64-85d9-d7f38f23d4b9" containerName="init"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.597133 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.604262 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.607765 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.607809 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.607914 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.608014 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-v7lv7"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.774796 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2q4\" (UniqueName: \"kubernetes.io/projected/037dd431-5912-4101-9895-0a6d11e627a6-kube-api-access-9h2q4\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.774856 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037dd431-5912-4101-9895-0a6d11e627a6-scripts\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.774887 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/037dd431-5912-4101-9895-0a6d11e627a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.774909 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.774992 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.775025 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037dd431-5912-4101-9895-0a6d11e627a6-config\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.775043 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.876916 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2q4\" (UniqueName: \"kubernetes.io/projected/037dd431-5912-4101-9895-0a6d11e627a6-kube-api-access-9h2q4\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.876967 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037dd431-5912-4101-9895-0a6d11e627a6-scripts\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.876993 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/037dd431-5912-4101-9895-0a6d11e627a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.877012 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.877069 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.877093 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037dd431-5912-4101-9895-0a6d11e627a6-config\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.877106 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.877559 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/037dd431-5912-4101-9895-0a6d11e627a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.878326 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037dd431-5912-4101-9895-0a6d11e627a6-config\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.878396 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/037dd431-5912-4101-9895-0a6d11e627a6-scripts\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.883718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.886892 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0"
Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.899710 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName:
\"kubernetes.io/secret/037dd431-5912-4101-9895-0a6d11e627a6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0" Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.900066 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2q4\" (UniqueName: \"kubernetes.io/projected/037dd431-5912-4101-9895-0a6d11e627a6-kube-api-access-9h2q4\") pod \"ovn-northd-0\" (UID: \"037dd431-5912-4101-9895-0a6d11e627a6\") " pod="openstack/ovn-northd-0" Feb 27 06:30:28 crc kubenswrapper[4725]: I0227 06:30:28.932796 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.442355 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.630423 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.636467 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zvrdk" Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.867972 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6kvbc-config-blgk2"] Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.882171 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.889171 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.891197 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kvbc-config-blgk2"] Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.929450 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.986023 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4746dfb9-lgz99"] Feb 27 06:30:29 crc kubenswrapper[4725]: I0227 06:30:29.986232 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerName="dnsmasq-dns" containerID="cri-o://e446b1467990d0319f6fa5f6e6829d0195dec81e99b0e90ae70e2c12e92ec51b" gracePeriod=10 Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.007474 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdgc\" (UniqueName: \"kubernetes.io/projected/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-kube-api-access-dkdgc\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.007542 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-log-ovn\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc 
kubenswrapper[4725]: I0227 06:30:30.007593 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.007679 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-additional-scripts\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.007720 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run-ovn\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.009393 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-scripts\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-additional-scripts\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " 
pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111411 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run-ovn\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-scripts\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111500 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdgc\" (UniqueName: \"kubernetes.io/projected/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-kube-api-access-dkdgc\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111531 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-log-ovn\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111573 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " 
pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111942 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-log-ovn\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.111987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run-ovn\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.112029 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.114190 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-scripts\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.114727 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-additional-scripts\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc 
kubenswrapper[4725]: I0227 06:30:30.132158 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdgc\" (UniqueName: \"kubernetes.io/projected/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-kube-api-access-dkdgc\") pod \"ovn-controller-6kvbc-config-blgk2\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.230272 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.343605 4725 generic.go:334] "Generic (PLEG): container finished" podID="bc2bb345-ef60-4c05-8461-1821e1db5216" containerID="099bf6ddcc61dc4f4862edbdcaba5b56beff3b6683678f541068a40c9e58e264" exitCode=0 Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.343673 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"bc2bb345-ef60-4c05-8461-1821e1db5216","Type":"ContainerDied","Data":"099bf6ddcc61dc4f4862edbdcaba5b56beff3b6683678f541068a40c9e58e264"} Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.351436 4725 generic.go:334] "Generic (PLEG): container finished" podID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerID="e446b1467990d0319f6fa5f6e6829d0195dec81e99b0e90ae70e2c12e92ec51b" exitCode=0 Feb 27 06:30:30 crc kubenswrapper[4725]: I0227 06:30:30.351514 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" event={"ID":"a73dee1c-bbf2-4637-b618-ee9baee91212","Type":"ContainerDied","Data":"e446b1467990d0319f6fa5f6e6829d0195dec81e99b0e90ae70e2c12e92ec51b"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.171773 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.334059 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6kvbc-config-blgk2"] Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.336616 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-config\") pod \"a73dee1c-bbf2-4637-b618-ee9baee91212\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.336795 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-dns-svc\") pod \"a73dee1c-bbf2-4637-b618-ee9baee91212\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.336829 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7s25\" (UniqueName: \"kubernetes.io/projected/a73dee1c-bbf2-4637-b618-ee9baee91212-kube-api-access-f7s25\") pod \"a73dee1c-bbf2-4637-b618-ee9baee91212\" (UID: \"a73dee1c-bbf2-4637-b618-ee9baee91212\") " Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.351940 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73dee1c-bbf2-4637-b618-ee9baee91212-kube-api-access-f7s25" (OuterVolumeSpecName: "kube-api-access-f7s25") pod "a73dee1c-bbf2-4637-b618-ee9baee91212" (UID: "a73dee1c-bbf2-4637-b618-ee9baee91212"). InnerVolumeSpecName "kube-api-access-f7s25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.363753 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" event={"ID":"a73dee1c-bbf2-4637-b618-ee9baee91212","Type":"ContainerDied","Data":"87ff56aa3c149ec25a1d4234be53c17ea3edd83e90cd261e9ef4142bdd103821"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.363771 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4746dfb9-lgz99" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.363803 4725 scope.go:117] "RemoveContainer" containerID="e446b1467990d0319f6fa5f6e6829d0195dec81e99b0e90ae70e2c12e92ec51b" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.367575 4725 generic.go:334] "Generic (PLEG): container finished" podID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerID="49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2" exitCode=0 Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.367881 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb0a4e8-6f65-4961-9caa-18d66a6754af","Type":"ContainerDied","Data":"49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.374711 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerStarted","Data":"63eb8bad680b557d83340805c294e94eb30b1a95bb7f30a52be57834007a6586"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.376135 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"037dd431-5912-4101-9895-0a6d11e627a6","Type":"ContainerStarted","Data":"ef23c5d0c02ec3cc92dce154bc8488e1451c051749d734b1cfef844ea2df049f"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.378647 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"bc2bb345-ef60-4c05-8461-1821e1db5216","Type":"ContainerStarted","Data":"a4a4d68bdeec96895f96b8fa92c89330adee113874fce68512957525945d961f"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.378801 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.382479 4725 generic.go:334] "Generic (PLEG): container finished" podID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerID="4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b" exitCode=0 Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.382516 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ac67077-5fb4-4890-98ba-f5280a08e464","Type":"ContainerDied","Data":"4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b"} Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.387398 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a73dee1c-bbf2-4637-b618-ee9baee91212" (UID: "a73dee1c-bbf2-4637-b618-ee9baee91212"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.404309 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-config" (OuterVolumeSpecName: "config") pod "a73dee1c-bbf2-4637-b618-ee9baee91212" (UID: "a73dee1c-bbf2-4637-b618-ee9baee91212"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.438271 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.916314698 podStartE2EDuration="1m6.438228048s" podCreationTimestamp="2026-02-27 06:29:25 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.391556725 +0000 UTC m=+1158.854177294" lastFinishedPulling="2026-02-27 06:30:30.913470075 +0000 UTC m=+1209.376090644" observedRunningTime="2026-02-27 06:30:31.418286757 +0000 UTC m=+1209.880907336" watchObservedRunningTime="2026-02-27 06:30:31.438228048 +0000 UTC m=+1209.900848627" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.440379 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.440618 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a73dee1c-bbf2-4637-b618-ee9baee91212-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.440701 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7s25\" (UniqueName: \"kubernetes.io/projected/a73dee1c-bbf2-4637-b618-ee9baee91212-kube-api-access-f7s25\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.667842 4725 scope.go:117] "RemoveContainer" containerID="a5e544fb659f275201a7603338ebcccc2fc5131665a168ab740c8050dab7cfc4" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.707563 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=59.22330365 podStartE2EDuration="1m13.70754545s" podCreationTimestamp="2026-02-27 06:29:18 +0000 UTC" firstStartedPulling="2026-02-27 
06:29:39.720796368 +0000 UTC m=+1158.183416937" lastFinishedPulling="2026-02-27 06:29:54.205038168 +0000 UTC m=+1172.667658737" observedRunningTime="2026-02-27 06:30:31.472003358 +0000 UTC m=+1209.934623937" watchObservedRunningTime="2026-02-27 06:30:31.70754545 +0000 UTC m=+1210.170166039" Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.714604 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4746dfb9-lgz99"] Feb 27 06:30:31 crc kubenswrapper[4725]: I0227 06:30:31.724249 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d4746dfb9-lgz99"] Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.276573 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" path="/var/lib/kubelet/pods/a73dee1c-bbf2-4637-b618-ee9baee91212/volumes" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.308026 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.326952 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.327008 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.392577 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kvbc-config-blgk2" event={"ID":"7ec8dec9-e895-45fb-b458-33df8c4fd4ec","Type":"ContainerStarted","Data":"a6361d62d111ad5e17d07b994b50dc79f8c362ecc94e02190c469c11966749d9"} Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.392618 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kvbc-config-blgk2" 
event={"ID":"7ec8dec9-e895-45fb-b458-33df8c4fd4ec","Type":"ContainerStarted","Data":"a5583cb045512e39029404f9a06c62392959e5766b5c8a4419037f0cb4a2ec03"} Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.395057 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ac67077-5fb4-4890-98ba-f5280a08e464","Type":"ContainerStarted","Data":"52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7"} Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.395328 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.398606 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb0a4e8-6f65-4961-9caa-18d66a6754af","Type":"ContainerStarted","Data":"de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2"} Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.399733 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.437490 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6kvbc-config-blgk2" podStartSLOduration=3.43746629 podStartE2EDuration="3.43746629s" podCreationTimestamp="2026-02-27 06:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:32.428813717 +0000 UTC m=+1210.891434296" watchObservedRunningTime="2026-02-27 06:30:32.43746629 +0000 UTC m=+1210.900086859" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.515882 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=59.594613241 podStartE2EDuration="1m13.515863324s" podCreationTimestamp="2026-02-27 06:29:19 +0000 UTC" 
firstStartedPulling="2026-02-27 06:29:40.394416045 +0000 UTC m=+1158.857036614" lastFinishedPulling="2026-02-27 06:29:54.315666128 +0000 UTC m=+1172.778286697" observedRunningTime="2026-02-27 06:30:32.483420832 +0000 UTC m=+1210.946041421" watchObservedRunningTime="2026-02-27 06:30:32.515863324 +0000 UTC m=+1210.978483893" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.519311 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.691151591 podStartE2EDuration="1m13.51927895s" podCreationTimestamp="2026-02-27 06:29:19 +0000 UTC" firstStartedPulling="2026-02-27 06:29:40.397414929 +0000 UTC m=+1158.860035498" lastFinishedPulling="2026-02-27 06:29:55.225542288 +0000 UTC m=+1173.688162857" observedRunningTime="2026-02-27 06:30:32.514392243 +0000 UTC m=+1210.977012842" watchObservedRunningTime="2026-02-27 06:30:32.51927895 +0000 UTC m=+1210.981899519" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.584778 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 27 06:30:32 crc kubenswrapper[4725]: I0227 06:30:32.697914 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.408987 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"037dd431-5912-4101-9895-0a6d11e627a6","Type":"ContainerStarted","Data":"78e02dcaaf45f85e6bc5c936739feb4da33d1136ae9dc27f5b2ddf8267a48adf"} Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.409339 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.409352 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"037dd431-5912-4101-9895-0a6d11e627a6","Type":"ContainerStarted","Data":"4f366f4a8fe29679b678725ca67cf3e9232124168558b25e82f303bc0d2b0452"} Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.411117 4725 generic.go:334] "Generic (PLEG): container finished" podID="7ec8dec9-e895-45fb-b458-33df8c4fd4ec" containerID="a6361d62d111ad5e17d07b994b50dc79f8c362ecc94e02190c469c11966749d9" exitCode=0 Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.411181 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kvbc-config-blgk2" event={"ID":"7ec8dec9-e895-45fb-b458-33df8c4fd4ec","Type":"ContainerDied","Data":"a6361d62d111ad5e17d07b994b50dc79f8c362ecc94e02190c469c11966749d9"} Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.435152 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.062684882 podStartE2EDuration="5.435129967s" podCreationTimestamp="2026-02-27 06:30:28 +0000 UTC" firstStartedPulling="2026-02-27 06:30:30.782532034 +0000 UTC m=+1209.245152603" lastFinishedPulling="2026-02-27 06:30:32.154977119 +0000 UTC m=+1210.617597688" observedRunningTime="2026-02-27 06:30:33.428757788 +0000 UTC m=+1211.891378377" watchObservedRunningTime="2026-02-27 06:30:33.435129967 +0000 UTC m=+1211.897750546" Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.784181 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.784226 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 27 06:30:33 crc kubenswrapper[4725]: I0227 06:30:33.903386 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.305979 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-db-create-ssnn8"] Feb 27 06:30:34 crc kubenswrapper[4725]: E0227 06:30:34.306359 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerName="init" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.306375 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerName="init" Feb 27 06:30:34 crc kubenswrapper[4725]: E0227 06:30:34.306386 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerName="dnsmasq-dns" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.306393 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerName="dnsmasq-dns" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.306599 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73dee1c-bbf2-4637-b618-ee9baee91212" containerName="dnsmasq-dns" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.307213 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.321970 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ssnn8"] Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.389655 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsqt8\" (UniqueName: \"kubernetes.io/projected/5a31f27e-dadb-461c-a614-77cc108a550f-kube-api-access-wsqt8\") pod \"glance-db-create-ssnn8\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.389752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a31f27e-dadb-461c-a614-77cc108a550f-operator-scripts\") pod \"glance-db-create-ssnn8\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.391349 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e1ec-account-create-update-qv24h"] Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.392530 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.394610 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.403148 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e1ec-account-create-update-qv24h"] Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.491167 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsqt8\" (UniqueName: \"kubernetes.io/projected/5a31f27e-dadb-461c-a614-77cc108a550f-kube-api-access-wsqt8\") pod \"glance-db-create-ssnn8\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.491212 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bb37bd-657c-48b6-9ed9-7039b6e7211f-operator-scripts\") pod \"glance-e1ec-account-create-update-qv24h\" (UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.491250 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a31f27e-dadb-461c-a614-77cc108a550f-operator-scripts\") pod \"glance-db-create-ssnn8\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.491304 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7b9t\" (UniqueName: \"kubernetes.io/projected/06bb37bd-657c-48b6-9ed9-7039b6e7211f-kube-api-access-r7b9t\") pod \"glance-e1ec-account-create-update-qv24h\" (UID: 
\"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.493812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a31f27e-dadb-461c-a614-77cc108a550f-operator-scripts\") pod \"glance-db-create-ssnn8\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:34 crc kubenswrapper[4725]: I0227 06:30:34.533256 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsqt8\" (UniqueName: \"kubernetes.io/projected/5a31f27e-dadb-461c-a614-77cc108a550f-kube-api-access-wsqt8\") pod \"glance-db-create-ssnn8\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.620170 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7b9t\" (UniqueName: \"kubernetes.io/projected/06bb37bd-657c-48b6-9ed9-7039b6e7211f-kube-api-access-r7b9t\") pod \"glance-e1ec-account-create-update-qv24h\" (UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.621713 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bb37bd-657c-48b6-9ed9-7039b6e7211f-operator-scripts\") pod \"glance-e1ec-account-create-update-qv24h\" (UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.623177 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bb37bd-657c-48b6-9ed9-7039b6e7211f-operator-scripts\") pod \"glance-e1ec-account-create-update-qv24h\" 
(UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.639747 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.650629 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7b9t\" (UniqueName: \"kubernetes.io/projected/06bb37bd-657c-48b6-9ed9-7039b6e7211f-kube-api-access-r7b9t\") pod \"glance-e1ec-account-create-update-qv24h\" (UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.653740 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.709673 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.860635 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.916937 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-x6dd6"] Feb 27 06:30:35 crc kubenswrapper[4725]: E0227 06:30:34.917268 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec8dec9-e895-45fb-b458-33df8c4fd4ec" containerName="ovn-config" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.917278 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec8dec9-e895-45fb-b458-33df8c4fd4ec" containerName="ovn-config" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.917759 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec8dec9-e895-45fb-b458-33df8c4fd4ec" containerName="ovn-config" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.918350 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:34.971061 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-x6dd6"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.030601 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run\") pod \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.030647 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run-ovn\") pod \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.030808 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-scripts\") pod \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.030832 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-log-ovn\") pod \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.030904 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkdgc\" (UniqueName: \"kubernetes.io/projected/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-kube-api-access-dkdgc\") pod \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.030935 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-additional-scripts\") pod \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\" (UID: \"7ec8dec9-e895-45fb-b458-33df8c4fd4ec\") " Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.031245 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnts\" (UniqueName: \"kubernetes.io/projected/e4381525-a993-4f94-8f82-7ce47ca8e67e-kube-api-access-jxnts\") pod \"keystone-db-create-x6dd6\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.031315 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4381525-a993-4f94-8f82-7ce47ca8e67e-operator-scripts\") pod \"keystone-db-create-x6dd6\" 
(UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.031493 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run" (OuterVolumeSpecName: "var-run") pod "7ec8dec9-e895-45fb-b458-33df8c4fd4ec" (UID: "7ec8dec9-e895-45fb-b458-33df8c4fd4ec"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.031545 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7ec8dec9-e895-45fb-b458-33df8c4fd4ec" (UID: "7ec8dec9-e895-45fb-b458-33df8c4fd4ec"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.032848 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-scripts" (OuterVolumeSpecName: "scripts") pod "7ec8dec9-e895-45fb-b458-33df8c4fd4ec" (UID: "7ec8dec9-e895-45fb-b458-33df8c4fd4ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.034334 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7ec8dec9-e895-45fb-b458-33df8c4fd4ec" (UID: "7ec8dec9-e895-45fb-b458-33df8c4fd4ec"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.035345 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7ec8dec9-e895-45fb-b458-33df8c4fd4ec" (UID: "7ec8dec9-e895-45fb-b458-33df8c4fd4ec"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.044183 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-kube-api-access-dkdgc" (OuterVolumeSpecName: "kube-api-access-dkdgc") pod "7ec8dec9-e895-45fb-b458-33df8c4fd4ec" (UID: "7ec8dec9-e895-45fb-b458-33df8c4fd4ec"). InnerVolumeSpecName "kube-api-access-dkdgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.051231 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4bcd-account-create-update-fxvct"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.053197 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.058204 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.068989 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4bcd-account-create-update-fxvct"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135078 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4381525-a993-4f94-8f82-7ce47ca8e67e-operator-scripts\") pod \"keystone-db-create-x6dd6\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135259 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnts\" (UniqueName: \"kubernetes.io/projected/e4381525-a993-4f94-8f82-7ce47ca8e67e-kube-api-access-jxnts\") pod \"keystone-db-create-x6dd6\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135330 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135341 4725 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135351 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkdgc\" (UniqueName: \"kubernetes.io/projected/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-kube-api-access-dkdgc\") on node \"crc\" DevicePath 
\"\"" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135361 4725 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135371 4725 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.135379 4725 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec8dec9-e895-45fb-b458-33df8c4fd4ec-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.136425 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4381525-a993-4f94-8f82-7ce47ca8e67e-operator-scripts\") pod \"keystone-db-create-x6dd6\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.164812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnts\" (UniqueName: \"kubernetes.io/projected/e4381525-a993-4f94-8f82-7ce47ca8e67e-kube-api-access-jxnts\") pod \"keystone-db-create-x6dd6\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.237183 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fa3398-4cf5-4247-b31d-f08de7692fa2-operator-scripts\") pod \"keystone-4bcd-account-create-update-fxvct\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") " pod="openstack/keystone-4bcd-account-create-update-fxvct" 
Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.237278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn4hm\" (UniqueName: \"kubernetes.io/projected/60fa3398-4cf5-4247-b31d-f08de7692fa2-kube-api-access-jn4hm\") pod \"keystone-4bcd-account-create-update-fxvct\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") " pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.255032 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xj5xl"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.256157 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.268660 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xj5xl"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.270568 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.332871 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d421-account-create-update-4kxss"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.334165 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.338155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fa3398-4cf5-4247-b31d-f08de7692fa2-operator-scripts\") pod \"keystone-4bcd-account-create-update-fxvct\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") " pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.338254 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn4hm\" (UniqueName: \"kubernetes.io/projected/60fa3398-4cf5-4247-b31d-f08de7692fa2-kube-api-access-jn4hm\") pod \"keystone-4bcd-account-create-update-fxvct\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") " pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.339501 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fa3398-4cf5-4247-b31d-f08de7692fa2-operator-scripts\") pod \"keystone-4bcd-account-create-update-fxvct\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") " pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.339627 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.341233 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d421-account-create-update-4kxss"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.384048 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn4hm\" (UniqueName: \"kubernetes.io/projected/60fa3398-4cf5-4247-b31d-f08de7692fa2-kube-api-access-jn4hm\") pod \"keystone-4bcd-account-create-update-fxvct\" 
(UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") " pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.400263 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4bcd-account-create-update-fxvct" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.439797 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwf5j\" (UniqueName: \"kubernetes.io/projected/7f9dd596-cbe0-4c2f-9024-e4724af56387-kube-api-access-bwf5j\") pod \"placement-d421-account-create-update-4kxss\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") " pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.440139 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de12797-1e77-407c-a08f-52ae3855f836-operator-scripts\") pod \"placement-db-create-xj5xl\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") " pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.440160 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2nnm\" (UniqueName: \"kubernetes.io/projected/2de12797-1e77-407c-a08f-52ae3855f836-kube-api-access-x2nnm\") pod \"placement-db-create-xj5xl\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") " pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.440201 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9dd596-cbe0-4c2f-9024-e4724af56387-operator-scripts\") pod \"placement-d421-account-create-update-4kxss\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") " 
pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.499613 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6kvbc-config-blgk2" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.504645 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6kvbc-config-blgk2" event={"ID":"7ec8dec9-e895-45fb-b458-33df8c4fd4ec","Type":"ContainerDied","Data":"a5583cb045512e39029404f9a06c62392959e5766b5c8a4419037f0cb4a2ec03"} Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.504685 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5583cb045512e39029404f9a06c62392959e5766b5c8a4419037f0cb4a2ec03" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.524845 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6kvbc-config-blgk2"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.538506 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6kvbc-config-blgk2"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.541700 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwf5j\" (UniqueName: \"kubernetes.io/projected/7f9dd596-cbe0-4c2f-9024-e4724af56387-kube-api-access-bwf5j\") pod \"placement-d421-account-create-update-4kxss\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") " pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.541815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de12797-1e77-407c-a08f-52ae3855f836-operator-scripts\") pod \"placement-db-create-xj5xl\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") " pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 
06:30:35.541854 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2nnm\" (UniqueName: \"kubernetes.io/projected/2de12797-1e77-407c-a08f-52ae3855f836-kube-api-access-x2nnm\") pod \"placement-db-create-xj5xl\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") " pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.541951 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9dd596-cbe0-4c2f-9024-e4724af56387-operator-scripts\") pod \"placement-d421-account-create-update-4kxss\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") " pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.542750 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9dd596-cbe0-4c2f-9024-e4724af56387-operator-scripts\") pod \"placement-d421-account-create-update-4kxss\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") " pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.542877 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de12797-1e77-407c-a08f-52ae3855f836-operator-scripts\") pod \"placement-db-create-xj5xl\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") " pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.575632 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwf5j\" (UniqueName: \"kubernetes.io/projected/7f9dd596-cbe0-4c2f-9024-e4724af56387-kube-api-access-bwf5j\") pod \"placement-d421-account-create-update-4kxss\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") " pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 
06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.577808 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2nnm\" (UniqueName: \"kubernetes.io/projected/2de12797-1e77-407c-a08f-52ae3855f836-kube-api-access-x2nnm\") pod \"placement-db-create-xj5xl\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") " pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.586652 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xj5xl" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.660860 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d421-account-create-update-4kxss" Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.725147 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e1ec-account-create-update-qv24h"] Feb 27 06:30:35 crc kubenswrapper[4725]: W0227 06:30:35.735894 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06bb37bd_657c_48b6_9ed9_7039b6e7211f.slice/crio-f3dad90d45b6b090ade75ac6d696e20d2d2a659dcf3ab99cd0485313dd44416d WatchSource:0}: Error finding container f3dad90d45b6b090ade75ac6d696e20d2d2a659dcf3ab99cd0485313dd44416d: Status 404 returned error can't find the container with id f3dad90d45b6b090ade75ac6d696e20d2d2a659dcf3ab99cd0485313dd44416d Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.755924 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ssnn8"] Feb 27 06:30:35 crc kubenswrapper[4725]: I0227 06:30:35.996174 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-x6dd6"] Feb 27 06:30:36 crc kubenswrapper[4725]: W0227 06:30:36.007507 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4381525_a993_4f94_8f82_7ce47ca8e67e.slice/crio-50fa64d0e64cb42e5b80483e3b1b514d3eb0dbbf74aae3a77cd7009216ae2e3a WatchSource:0}: Error finding container 50fa64d0e64cb42e5b80483e3b1b514d3eb0dbbf74aae3a77cd7009216ae2e3a: Status 404 returned error can't find the container with id 50fa64d0e64cb42e5b80483e3b1b514d3eb0dbbf74aae3a77cd7009216ae2e3a Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.114356 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4bcd-account-create-update-fxvct"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.125662 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-s9fpl"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.136191 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.140371 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-a42a-account-create-update-cc5w5"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.141353 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.143493 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.152797 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-s9fpl"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.178570 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a42a-account-create-update-cc5w5"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.260946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-operator-scripts\") pod \"watcher-a42a-account-create-update-cc5w5\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") " pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.261014 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5255cba1-2ca7-460a-b112-28aa45156734-operator-scripts\") pod \"watcher-db-create-s9fpl\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") " pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.261060 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdwdc\" (UniqueName: \"kubernetes.io/projected/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-kube-api-access-hdwdc\") pod \"watcher-a42a-account-create-update-cc5w5\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") " pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.261088 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn84l\" (UniqueName: \"kubernetes.io/projected/5255cba1-2ca7-460a-b112-28aa45156734-kube-api-access-pn84l\") pod \"watcher-db-create-s9fpl\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") " pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.269960 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec8dec9-e895-45fb-b458-33df8c4fd4ec" path="/var/lib/kubelet/pods/7ec8dec9-e895-45fb-b458-33df8c4fd4ec/volumes" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.270810 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xj5xl"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.276063 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d421-account-create-update-4kxss"] Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.362824 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdwdc\" (UniqueName: \"kubernetes.io/projected/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-kube-api-access-hdwdc\") pod \"watcher-a42a-account-create-update-cc5w5\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") " pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.362866 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn84l\" (UniqueName: \"kubernetes.io/projected/5255cba1-2ca7-460a-b112-28aa45156734-kube-api-access-pn84l\") pod \"watcher-db-create-s9fpl\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") " pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.363014 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-operator-scripts\") pod 
\"watcher-a42a-account-create-update-cc5w5\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") " pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.363045 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5255cba1-2ca7-460a-b112-28aa45156734-operator-scripts\") pod \"watcher-db-create-s9fpl\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") " pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.363961 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-operator-scripts\") pod \"watcher-a42a-account-create-update-cc5w5\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") " pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.363962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5255cba1-2ca7-460a-b112-28aa45156734-operator-scripts\") pod \"watcher-db-create-s9fpl\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") " pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.386768 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdwdc\" (UniqueName: \"kubernetes.io/projected/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-kube-api-access-hdwdc\") pod \"watcher-a42a-account-create-update-cc5w5\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") " pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.392638 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn84l\" (UniqueName: 
\"kubernetes.io/projected/5255cba1-2ca7-460a-b112-28aa45156734-kube-api-access-pn84l\") pod \"watcher-db-create-s9fpl\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") " pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.476866 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9fpl" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.498019 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a42a-account-create-update-cc5w5" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.519934 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xj5xl" event={"ID":"2de12797-1e77-407c-a08f-52ae3855f836","Type":"ContainerStarted","Data":"aa0da31d84c32f8300d73b512a897ddf676a53cbe50b4a12a60296f956535ed3"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.519981 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xj5xl" event={"ID":"2de12797-1e77-407c-a08f-52ae3855f836","Type":"ContainerStarted","Data":"d7bbee91fbac02919d38fdca4240e4eb82fc538b3a6d3b304ada3e6ec07e155a"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.531547 4725 generic.go:334] "Generic (PLEG): container finished" podID="06bb37bd-657c-48b6-9ed9-7039b6e7211f" containerID="7e07b91b2eb6e5b943fa682856292559ea3dd492925584930d492475ca4fb8ef" exitCode=0 Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.531703 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e1ec-account-create-update-qv24h" event={"ID":"06bb37bd-657c-48b6-9ed9-7039b6e7211f","Type":"ContainerDied","Data":"7e07b91b2eb6e5b943fa682856292559ea3dd492925584930d492475ca4fb8ef"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.531728 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e1ec-account-create-update-qv24h" 
event={"ID":"06bb37bd-657c-48b6-9ed9-7039b6e7211f","Type":"ContainerStarted","Data":"f3dad90d45b6b090ade75ac6d696e20d2d2a659dcf3ab99cd0485313dd44416d"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.539216 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-xj5xl" podStartSLOduration=1.539204334 podStartE2EDuration="1.539204334s" podCreationTimestamp="2026-02-27 06:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:36.532613128 +0000 UTC m=+1214.995233697" watchObservedRunningTime="2026-02-27 06:30:36.539204334 +0000 UTC m=+1215.001824903" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.543963 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bq24l" event={"ID":"40a2ae59-8725-42be-984a-739a82d476c5","Type":"ContainerStarted","Data":"3dd7af17451f066d14837c43b1002f8a5bf89ffb7082849f81e3d22e7a0b4981"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.547763 4725 generic.go:334] "Generic (PLEG): container finished" podID="5a31f27e-dadb-461c-a614-77cc108a550f" containerID="dc22c3ac827db7656bbb3e966573510d429daac993b5244e894dd42018760a1e" exitCode=0 Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.547821 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ssnn8" event={"ID":"5a31f27e-dadb-461c-a614-77cc108a550f","Type":"ContainerDied","Data":"dc22c3ac827db7656bbb3e966573510d429daac993b5244e894dd42018760a1e"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.547843 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ssnn8" event={"ID":"5a31f27e-dadb-461c-a614-77cc108a550f","Type":"ContainerStarted","Data":"5afa7da7910a91dd4f4c890354051571b6bd3028d79ca545b788e11bd6acd012"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.549507 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-d421-account-create-update-4kxss" event={"ID":"7f9dd596-cbe0-4c2f-9024-e4724af56387","Type":"ContainerStarted","Data":"d0c135272a567fbf7a4905407edf1c7d047f92a605da1cf0b76eb7cec17e838a"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.549544 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d421-account-create-update-4kxss" event={"ID":"7f9dd596-cbe0-4c2f-9024-e4724af56387","Type":"ContainerStarted","Data":"a1edf91af7fd3731df7ce75fb6574bef85a699abda6bc93c94e7664d6675f371"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.554243 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bcd-account-create-update-fxvct" event={"ID":"60fa3398-4cf5-4247-b31d-f08de7692fa2","Type":"ContainerStarted","Data":"bbf3f84458346ea8d67869d214c8a8e33f58e11e9c444df242e31b064707c2da"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.554310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bcd-account-create-update-fxvct" event={"ID":"60fa3398-4cf5-4247-b31d-f08de7692fa2","Type":"ContainerStarted","Data":"0b3b19cc00479e4e2bc3823d1a6e78e95e10fadaf4bab710a677d3c811c45d5f"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.559541 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x6dd6" event={"ID":"e4381525-a993-4f94-8f82-7ce47ca8e67e","Type":"ContainerStarted","Data":"40639ff8b1fe06a901d046c91fbd26ae36baaa256aa83ee36f6663df215bcf9a"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.559588 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x6dd6" event={"ID":"e4381525-a993-4f94-8f82-7ce47ca8e67e","Type":"ContainerStarted","Data":"50fa64d0e64cb42e5b80483e3b1b514d3eb0dbbf74aae3a77cd7009216ae2e3a"} Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.575645 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-ring-rebalance-bq24l" podStartSLOduration=2.266742796 podStartE2EDuration="25.575626977s" podCreationTimestamp="2026-02-27 06:30:11 +0000 UTC" firstStartedPulling="2026-02-27 06:30:12.060757282 +0000 UTC m=+1190.523377851" lastFinishedPulling="2026-02-27 06:30:35.369641463 +0000 UTC m=+1213.832262032" observedRunningTime="2026-02-27 06:30:36.571780319 +0000 UTC m=+1215.034400898" watchObservedRunningTime="2026-02-27 06:30:36.575626977 +0000 UTC m=+1215.038247546" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.598940 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d421-account-create-update-4kxss" podStartSLOduration=1.598922722 podStartE2EDuration="1.598922722s" podCreationTimestamp="2026-02-27 06:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:36.592940574 +0000 UTC m=+1215.055561153" watchObservedRunningTime="2026-02-27 06:30:36.598922722 +0000 UTC m=+1215.061543291" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.620814 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-x6dd6" podStartSLOduration=2.620799547 podStartE2EDuration="2.620799547s" podCreationTimestamp="2026-02-27 06:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:36.618318258 +0000 UTC m=+1215.080938847" watchObservedRunningTime="2026-02-27 06:30:36.620799547 +0000 UTC m=+1215.083420116" Feb 27 06:30:36 crc kubenswrapper[4725]: I0227 06:30:36.654441 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-4bcd-account-create-update-fxvct" podStartSLOduration=1.654422173 podStartE2EDuration="1.654422173s" podCreationTimestamp="2026-02-27 06:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:36.653844036 +0000 UTC m=+1215.116464605" watchObservedRunningTime="2026-02-27 06:30:36.654422173 +0000 UTC m=+1215.117042742" Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.006542 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-s9fpl"] Feb 27 06:30:37 crc kubenswrapper[4725]: W0227 06:30:37.008509 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5255cba1_2ca7_460a_b112_28aa45156734.slice/crio-5707d17b8e6661d0376b719a416f9077f3331bffe178efabd20546d617176e71 WatchSource:0}: Error finding container 5707d17b8e6661d0376b719a416f9077f3331bffe178efabd20546d617176e71: Status 404 returned error can't find the container with id 5707d17b8e6661d0376b719a416f9077f3331bffe178efabd20546d617176e71 Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.087010 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a42a-account-create-update-cc5w5"] Feb 27 06:30:37 crc kubenswrapper[4725]: W0227 06:30:37.093103 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44334cb8_8e0a_4fb3_976e_b140f4c4f79b.slice/crio-0de5acdee11389f2433244430598854817e3c5641f42eb3fbb3deed57ced9d8a WatchSource:0}: Error finding container 0de5acdee11389f2433244430598854817e3c5641f42eb3fbb3deed57ced9d8a: Status 404 returned error can't find the container with id 0de5acdee11389f2433244430598854817e3c5641f42eb3fbb3deed57ced9d8a Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.566848 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42a-account-create-update-cc5w5" event={"ID":"44334cb8-8e0a-4fb3-976e-b140f4c4f79b","Type":"ContainerStarted","Data":"9ac8cd03bc4170645b1e7db4df6a722aa4ff22d813c9df2e59edb9dd8dc28e01"} Feb 27 06:30:37 crc 
kubenswrapper[4725]: I0227 06:30:37.567166 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42a-account-create-update-cc5w5" event={"ID":"44334cb8-8e0a-4fb3-976e-b140f4c4f79b","Type":"ContainerStarted","Data":"0de5acdee11389f2433244430598854817e3c5641f42eb3fbb3deed57ced9d8a"} Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.568150 4725 generic.go:334] "Generic (PLEG): container finished" podID="e4381525-a993-4f94-8f82-7ce47ca8e67e" containerID="40639ff8b1fe06a901d046c91fbd26ae36baaa256aa83ee36f6663df215bcf9a" exitCode=0 Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.568192 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x6dd6" event={"ID":"e4381525-a993-4f94-8f82-7ce47ca8e67e","Type":"ContainerDied","Data":"40639ff8b1fe06a901d046c91fbd26ae36baaa256aa83ee36f6663df215bcf9a"} Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.570806 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9fpl" event={"ID":"5255cba1-2ca7-460a-b112-28aa45156734","Type":"ContainerStarted","Data":"52516dca89d290292980c8f2a21d7017c54e8c82475cf671ae9de2fe70912d42"} Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.570838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9fpl" event={"ID":"5255cba1-2ca7-460a-b112-28aa45156734","Type":"ContainerStarted","Data":"5707d17b8e6661d0376b719a416f9077f3331bffe178efabd20546d617176e71"} Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.589986 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-a42a-account-create-update-cc5w5" podStartSLOduration=1.5899681430000001 podStartE2EDuration="1.589968143s" podCreationTimestamp="2026-02-27 06:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:37.586875256 +0000 UTC m=+1216.049495845" 
watchObservedRunningTime="2026-02-27 06:30:37.589968143 +0000 UTC m=+1216.052588712" Feb 27 06:30:37 crc kubenswrapper[4725]: I0227 06:30:37.628549 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-s9fpl" podStartSLOduration=1.628530137 podStartE2EDuration="1.628530137s" podCreationTimestamp="2026-02-27 06:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:37.627983582 +0000 UTC m=+1216.090604161" watchObservedRunningTime="2026-02-27 06:30:37.628530137 +0000 UTC m=+1216.091150716" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.136922 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.143249 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.198101 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a31f27e-dadb-461c-a614-77cc108a550f-operator-scripts\") pod \"5a31f27e-dadb-461c-a614-77cc108a550f\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.198201 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsqt8\" (UniqueName: \"kubernetes.io/projected/5a31f27e-dadb-461c-a614-77cc108a550f-kube-api-access-wsqt8\") pod \"5a31f27e-dadb-461c-a614-77cc108a550f\" (UID: \"5a31f27e-dadb-461c-a614-77cc108a550f\") " Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.199030 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a31f27e-dadb-461c-a614-77cc108a550f-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "5a31f27e-dadb-461c-a614-77cc108a550f" (UID: "5a31f27e-dadb-461c-a614-77cc108a550f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.225457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a31f27e-dadb-461c-a614-77cc108a550f-kube-api-access-wsqt8" (OuterVolumeSpecName: "kube-api-access-wsqt8") pod "5a31f27e-dadb-461c-a614-77cc108a550f" (UID: "5a31f27e-dadb-461c-a614-77cc108a550f"). InnerVolumeSpecName "kube-api-access-wsqt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.299275 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bb37bd-657c-48b6-9ed9-7039b6e7211f-operator-scripts\") pod \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\" (UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.299379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7b9t\" (UniqueName: \"kubernetes.io/projected/06bb37bd-657c-48b6-9ed9-7039b6e7211f-kube-api-access-r7b9t\") pod \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\" (UID: \"06bb37bd-657c-48b6-9ed9-7039b6e7211f\") " Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.299874 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a31f27e-dadb-461c-a614-77cc108a550f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.299889 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsqt8\" (UniqueName: \"kubernetes.io/projected/5a31f27e-dadb-461c-a614-77cc108a550f-kube-api-access-wsqt8\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:38 crc 
kubenswrapper[4725]: I0227 06:30:38.300476 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06bb37bd-657c-48b6-9ed9-7039b6e7211f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06bb37bd-657c-48b6-9ed9-7039b6e7211f" (UID: "06bb37bd-657c-48b6-9ed9-7039b6e7211f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.303882 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06bb37bd-657c-48b6-9ed9-7039b6e7211f-kube-api-access-r7b9t" (OuterVolumeSpecName: "kube-api-access-r7b9t") pod "06bb37bd-657c-48b6-9ed9-7039b6e7211f" (UID: "06bb37bd-657c-48b6-9ed9-7039b6e7211f"). InnerVolumeSpecName "kube-api-access-r7b9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.401646 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bb37bd-657c-48b6-9ed9-7039b6e7211f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.401677 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7b9t\" (UniqueName: \"kubernetes.io/projected/06bb37bd-657c-48b6-9ed9-7039b6e7211f-kube-api-access-r7b9t\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.582449 4725 generic.go:334] "Generic (PLEG): container finished" podID="44334cb8-8e0a-4fb3-976e-b140f4c4f79b" containerID="9ac8cd03bc4170645b1e7db4df6a722aa4ff22d813c9df2e59edb9dd8dc28e01" exitCode=0 Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.582493 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42a-account-create-update-cc5w5" 
event={"ID":"44334cb8-8e0a-4fb3-976e-b140f4c4f79b","Type":"ContainerDied","Data":"9ac8cd03bc4170645b1e7db4df6a722aa4ff22d813c9df2e59edb9dd8dc28e01"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.583846 4725 generic.go:334] "Generic (PLEG): container finished" podID="2de12797-1e77-407c-a08f-52ae3855f836" containerID="aa0da31d84c32f8300d73b512a897ddf676a53cbe50b4a12a60296f956535ed3" exitCode=0 Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.583950 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xj5xl" event={"ID":"2de12797-1e77-407c-a08f-52ae3855f836","Type":"ContainerDied","Data":"aa0da31d84c32f8300d73b512a897ddf676a53cbe50b4a12a60296f956535ed3"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.586485 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e1ec-account-create-update-qv24h" event={"ID":"06bb37bd-657c-48b6-9ed9-7039b6e7211f","Type":"ContainerDied","Data":"f3dad90d45b6b090ade75ac6d696e20d2d2a659dcf3ab99cd0485313dd44416d"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.586518 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3dad90d45b6b090ade75ac6d696e20d2d2a659dcf3ab99cd0485313dd44416d" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.586497 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e1ec-account-create-update-qv24h" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.588155 4725 generic.go:334] "Generic (PLEG): container finished" podID="5255cba1-2ca7-460a-b112-28aa45156734" containerID="52516dca89d290292980c8f2a21d7017c54e8c82475cf671ae9de2fe70912d42" exitCode=0 Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.588218 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9fpl" event={"ID":"5255cba1-2ca7-460a-b112-28aa45156734","Type":"ContainerDied","Data":"52516dca89d290292980c8f2a21d7017c54e8c82475cf671ae9de2fe70912d42"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.590255 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ssnn8" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.590258 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ssnn8" event={"ID":"5a31f27e-dadb-461c-a614-77cc108a550f","Type":"ContainerDied","Data":"5afa7da7910a91dd4f4c890354051571b6bd3028d79ca545b788e11bd6acd012"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.590318 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5afa7da7910a91dd4f4c890354051571b6bd3028d79ca545b788e11bd6acd012" Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.592490 4725 generic.go:334] "Generic (PLEG): container finished" podID="7f9dd596-cbe0-4c2f-9024-e4724af56387" containerID="d0c135272a567fbf7a4905407edf1c7d047f92a605da1cf0b76eb7cec17e838a" exitCode=0 Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.592540 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d421-account-create-update-4kxss" event={"ID":"7f9dd596-cbe0-4c2f-9024-e4724af56387","Type":"ContainerDied","Data":"d0c135272a567fbf7a4905407edf1c7d047f92a605da1cf0b76eb7cec17e838a"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.596274 
4725 generic.go:334] "Generic (PLEG): container finished" podID="60fa3398-4cf5-4247-b31d-f08de7692fa2" containerID="bbf3f84458346ea8d67869d214c8a8e33f58e11e9c444df242e31b064707c2da" exitCode=0 Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.596325 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bcd-account-create-update-fxvct" event={"ID":"60fa3398-4cf5-4247-b31d-f08de7692fa2","Type":"ContainerDied","Data":"bbf3f84458346ea8d67869d214c8a8e33f58e11e9c444df242e31b064707c2da"} Feb 27 06:30:38 crc kubenswrapper[4725]: I0227 06:30:38.916763 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-x6dd6" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.015023 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4381525-a993-4f94-8f82-7ce47ca8e67e-operator-scripts\") pod \"e4381525-a993-4f94-8f82-7ce47ca8e67e\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.015111 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxnts\" (UniqueName: \"kubernetes.io/projected/e4381525-a993-4f94-8f82-7ce47ca8e67e-kube-api-access-jxnts\") pod \"e4381525-a993-4f94-8f82-7ce47ca8e67e\" (UID: \"e4381525-a993-4f94-8f82-7ce47ca8e67e\") " Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.015431 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4381525-a993-4f94-8f82-7ce47ca8e67e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4381525-a993-4f94-8f82-7ce47ca8e67e" (UID: "e4381525-a993-4f94-8f82-7ce47ca8e67e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.015555 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4381525-a993-4f94-8f82-7ce47ca8e67e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.019354 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4381525-a993-4f94-8f82-7ce47ca8e67e-kube-api-access-jxnts" (OuterVolumeSpecName: "kube-api-access-jxnts") pod "e4381525-a993-4f94-8f82-7ce47ca8e67e" (UID: "e4381525-a993-4f94-8f82-7ce47ca8e67e"). InnerVolumeSpecName "kube-api-access-jxnts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.139155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.139449 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxnts\" (UniqueName: \"kubernetes.io/projected/e4381525-a993-4f94-8f82-7ce47ca8e67e-kube-api-access-jxnts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:39 crc kubenswrapper[4725]: E0227 06:30:39.139530 4725 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 06:30:39 crc kubenswrapper[4725]: E0227 06:30:39.139567 4725 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 06:30:39 crc kubenswrapper[4725]: E0227 06:30:39.139624 4725 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift podName:872eba69-b1d2-4028-b65f-b70fa14daeb0 nodeName:}" failed. No retries permitted until 2026-02-27 06:31:11.139607969 +0000 UTC m=+1249.602228538 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift") pod "swift-storage-0" (UID: "872eba69-b1d2-4028-b65f-b70fa14daeb0") : configmap "swift-ring-files" not found Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.551786 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6kvbc" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566062 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-98cpk"] Feb 27 06:30:39 crc kubenswrapper[4725]: E0227 06:30:39.566465 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a31f27e-dadb-461c-a614-77cc108a550f" containerName="mariadb-database-create" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566481 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a31f27e-dadb-461c-a614-77cc108a550f" containerName="mariadb-database-create" Feb 27 06:30:39 crc kubenswrapper[4725]: E0227 06:30:39.566512 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4381525-a993-4f94-8f82-7ce47ca8e67e" containerName="mariadb-database-create" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566520 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4381525-a993-4f94-8f82-7ce47ca8e67e" containerName="mariadb-database-create" Feb 27 06:30:39 crc kubenswrapper[4725]: E0227 06:30:39.566540 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06bb37bd-657c-48b6-9ed9-7039b6e7211f" containerName="mariadb-account-create-update" Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566556 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06bb37bd-657c-48b6-9ed9-7039b6e7211f" containerName="mariadb-account-create-update"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566786 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="06bb37bd-657c-48b6-9ed9-7039b6e7211f" containerName="mariadb-account-create-update"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566802 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a31f27e-dadb-461c-a614-77cc108a550f" containerName="mariadb-database-create"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.566822 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4381525-a993-4f94-8f82-7ce47ca8e67e" containerName="mariadb-database-create"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.567437 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.571991 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mg9ls"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.572171 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.577245 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-98cpk"]
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.622599 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-x6dd6"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.630011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x6dd6" event={"ID":"e4381525-a993-4f94-8f82-7ce47ca8e67e","Type":"ContainerDied","Data":"50fa64d0e64cb42e5b80483e3b1b514d3eb0dbbf74aae3a77cd7009216ae2e3a"}
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.630050 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50fa64d0e64cb42e5b80483e3b1b514d3eb0dbbf74aae3a77cd7009216ae2e3a"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.648903 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhsz\" (UniqueName: \"kubernetes.io/projected/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-kube-api-access-sbhsz\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.648950 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-config-data\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.648994 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-combined-ca-bundle\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.649027 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-db-sync-config-data\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.750268 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbhsz\" (UniqueName: \"kubernetes.io/projected/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-kube-api-access-sbhsz\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.750622 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-config-data\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.750675 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-combined-ca-bundle\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.750714 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-db-sync-config-data\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.754256 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-db-sync-config-data\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.764075 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-combined-ca-bundle\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.773408 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbhsz\" (UniqueName: \"kubernetes.io/projected/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-kube-api-access-sbhsz\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.777840 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-config-data\") pod \"glance-db-sync-98cpk\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:39 crc kubenswrapper[4725]: I0227 06:30:39.902514 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-98cpk"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.017538 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a42a-account-create-update-cc5w5"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.161798 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdwdc\" (UniqueName: \"kubernetes.io/projected/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-kube-api-access-hdwdc\") pod \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.162102 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-operator-scripts\") pod \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\" (UID: \"44334cb8-8e0a-4fb3-976e-b140f4c4f79b\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.163338 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44334cb8-8e0a-4fb3-976e-b140f4c4f79b" (UID: "44334cb8-8e0a-4fb3-976e-b140f4c4f79b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.173401 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-kube-api-access-hdwdc" (OuterVolumeSpecName: "kube-api-access-hdwdc") pod "44334cb8-8e0a-4fb3-976e-b140f4c4f79b" (UID: "44334cb8-8e0a-4fb3-976e-b140f4c4f79b"). InnerVolumeSpecName "kube-api-access-hdwdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.216107 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d421-account-create-update-4kxss"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.265998 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdwdc\" (UniqueName: \"kubernetes.io/projected/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-kube-api-access-hdwdc\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.266030 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44334cb8-8e0a-4fb3-976e-b140f4c4f79b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.330574 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xj5xl"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.366622 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwf5j\" (UniqueName: \"kubernetes.io/projected/7f9dd596-cbe0-4c2f-9024-e4724af56387-kube-api-access-bwf5j\") pod \"7f9dd596-cbe0-4c2f-9024-e4724af56387\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.366738 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9dd596-cbe0-4c2f-9024-e4724af56387-operator-scripts\") pod \"7f9dd596-cbe0-4c2f-9024-e4724af56387\" (UID: \"7f9dd596-cbe0-4c2f-9024-e4724af56387\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.367874 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f9dd596-cbe0-4c2f-9024-e4724af56387-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f9dd596-cbe0-4c2f-9024-e4724af56387" (UID: "7f9dd596-cbe0-4c2f-9024-e4724af56387"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.373521 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9dd596-cbe0-4c2f-9024-e4724af56387-kube-api-access-bwf5j" (OuterVolumeSpecName: "kube-api-access-bwf5j") pod "7f9dd596-cbe0-4c2f-9024-e4724af56387" (UID: "7f9dd596-cbe0-4c2f-9024-e4724af56387"). InnerVolumeSpecName "kube-api-access-bwf5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.377119 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4bcd-account-create-update-fxvct"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.416619 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9fpl"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.467614 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fa3398-4cf5-4247-b31d-f08de7692fa2-operator-scripts\") pod \"60fa3398-4cf5-4247-b31d-f08de7692fa2\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.467799 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de12797-1e77-407c-a08f-52ae3855f836-operator-scripts\") pod \"2de12797-1e77-407c-a08f-52ae3855f836\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.467833 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2nnm\" (UniqueName: \"kubernetes.io/projected/2de12797-1e77-407c-a08f-52ae3855f836-kube-api-access-x2nnm\") pod \"2de12797-1e77-407c-a08f-52ae3855f836\" (UID: \"2de12797-1e77-407c-a08f-52ae3855f836\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.467892 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn4hm\" (UniqueName: \"kubernetes.io/projected/60fa3398-4cf5-4247-b31d-f08de7692fa2-kube-api-access-jn4hm\") pod \"60fa3398-4cf5-4247-b31d-f08de7692fa2\" (UID: \"60fa3398-4cf5-4247-b31d-f08de7692fa2\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.468201 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwf5j\" (UniqueName: \"kubernetes.io/projected/7f9dd596-cbe0-4c2f-9024-e4724af56387-kube-api-access-bwf5j\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.468258 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9dd596-cbe0-4c2f-9024-e4724af56387-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.468342 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de12797-1e77-407c-a08f-52ae3855f836-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2de12797-1e77-407c-a08f-52ae3855f836" (UID: "2de12797-1e77-407c-a08f-52ae3855f836"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.471632 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fa3398-4cf5-4247-b31d-f08de7692fa2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60fa3398-4cf5-4247-b31d-f08de7692fa2" (UID: "60fa3398-4cf5-4247-b31d-f08de7692fa2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.472196 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de12797-1e77-407c-a08f-52ae3855f836-kube-api-access-x2nnm" (OuterVolumeSpecName: "kube-api-access-x2nnm") pod "2de12797-1e77-407c-a08f-52ae3855f836" (UID: "2de12797-1e77-407c-a08f-52ae3855f836"). InnerVolumeSpecName "kube-api-access-x2nnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.472462 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fa3398-4cf5-4247-b31d-f08de7692fa2-kube-api-access-jn4hm" (OuterVolumeSpecName: "kube-api-access-jn4hm") pod "60fa3398-4cf5-4247-b31d-f08de7692fa2" (UID: "60fa3398-4cf5-4247-b31d-f08de7692fa2"). InnerVolumeSpecName "kube-api-access-jn4hm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.569501 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5255cba1-2ca7-460a-b112-28aa45156734-operator-scripts\") pod \"5255cba1-2ca7-460a-b112-28aa45156734\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.569734 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn84l\" (UniqueName: \"kubernetes.io/projected/5255cba1-2ca7-460a-b112-28aa45156734-kube-api-access-pn84l\") pod \"5255cba1-2ca7-460a-b112-28aa45156734\" (UID: \"5255cba1-2ca7-460a-b112-28aa45156734\") "
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.570111 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de12797-1e77-407c-a08f-52ae3855f836-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.570129 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2nnm\" (UniqueName: \"kubernetes.io/projected/2de12797-1e77-407c-a08f-52ae3855f836-kube-api-access-x2nnm\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.570141 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn4hm\" (UniqueName: \"kubernetes.io/projected/60fa3398-4cf5-4247-b31d-f08de7692fa2-kube-api-access-jn4hm\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.570149 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fa3398-4cf5-4247-b31d-f08de7692fa2-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.570994 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5255cba1-2ca7-460a-b112-28aa45156734-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5255cba1-2ca7-460a-b112-28aa45156734" (UID: "5255cba1-2ca7-460a-b112-28aa45156734"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.573279 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5255cba1-2ca7-460a-b112-28aa45156734-kube-api-access-pn84l" (OuterVolumeSpecName: "kube-api-access-pn84l") pod "5255cba1-2ca7-460a-b112-28aa45156734" (UID: "5255cba1-2ca7-460a-b112-28aa45156734"). InnerVolumeSpecName "kube-api-access-pn84l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.629301 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xj5xl" event={"ID":"2de12797-1e77-407c-a08f-52ae3855f836","Type":"ContainerDied","Data":"d7bbee91fbac02919d38fdca4240e4eb82fc538b3a6d3b304ada3e6ec07e155a"}
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.629337 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7bbee91fbac02919d38fdca4240e4eb82fc538b3a6d3b304ada3e6ec07e155a"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.629381 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xj5xl"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.630772 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-s9fpl" event={"ID":"5255cba1-2ca7-460a-b112-28aa45156734","Type":"ContainerDied","Data":"5707d17b8e6661d0376b719a416f9077f3331bffe178efabd20546d617176e71"}
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.630794 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5707d17b8e6661d0376b719a416f9077f3331bffe178efabd20546d617176e71"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.630845 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-s9fpl"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.642001 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d421-account-create-update-4kxss" event={"ID":"7f9dd596-cbe0-4c2f-9024-e4724af56387","Type":"ContainerDied","Data":"a1edf91af7fd3731df7ce75fb6574bef85a699abda6bc93c94e7664d6675f371"}
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.642039 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1edf91af7fd3731df7ce75fb6574bef85a699abda6bc93c94e7664d6675f371"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.642099 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d421-account-create-update-4kxss"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.645430 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4bcd-account-create-update-fxvct" event={"ID":"60fa3398-4cf5-4247-b31d-f08de7692fa2","Type":"ContainerDied","Data":"0b3b19cc00479e4e2bc3823d1a6e78e95e10fadaf4bab710a677d3c811c45d5f"}
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.645465 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3b19cc00479e4e2bc3823d1a6e78e95e10fadaf4bab710a677d3c811c45d5f"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.645516 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4bcd-account-create-update-fxvct"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.647759 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a42a-account-create-update-cc5w5" event={"ID":"44334cb8-8e0a-4fb3-976e-b140f4c4f79b","Type":"ContainerDied","Data":"0de5acdee11389f2433244430598854817e3c5641f42eb3fbb3deed57ced9d8a"}
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.647812 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de5acdee11389f2433244430598854817e3c5641f42eb3fbb3deed57ced9d8a"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.647878 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a42a-account-create-update-cc5w5"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.653585 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-98cpk"]
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.671547 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5255cba1-2ca7-460a-b112-28aa45156734-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.671578 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn84l\" (UniqueName: \"kubernetes.io/projected/5255cba1-2ca7-460a-b112-28aa45156734-kube-api-access-pn84l\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890426 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tttct"]
Feb 27 06:30:40 crc kubenswrapper[4725]: E0227 06:30:40.890731 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44334cb8-8e0a-4fb3-976e-b140f4c4f79b" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890745 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="44334cb8-8e0a-4fb3-976e-b140f4c4f79b" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: E0227 06:30:40.890757 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de12797-1e77-407c-a08f-52ae3855f836" containerName="mariadb-database-create"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890763 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de12797-1e77-407c-a08f-52ae3855f836" containerName="mariadb-database-create"
Feb 27 06:30:40 crc kubenswrapper[4725]: E0227 06:30:40.890775 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fa3398-4cf5-4247-b31d-f08de7692fa2" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890782 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fa3398-4cf5-4247-b31d-f08de7692fa2" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: E0227 06:30:40.890805 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9dd596-cbe0-4c2f-9024-e4724af56387" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890811 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9dd596-cbe0-4c2f-9024-e4724af56387" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: E0227 06:30:40.890821 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5255cba1-2ca7-460a-b112-28aa45156734" containerName="mariadb-database-create"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890827 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5255cba1-2ca7-460a-b112-28aa45156734" containerName="mariadb-database-create"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.890994 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9dd596-cbe0-4c2f-9024-e4724af56387" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.891008 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de12797-1e77-407c-a08f-52ae3855f836" containerName="mariadb-database-create"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.891018 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="44334cb8-8e0a-4fb3-976e-b140f4c4f79b" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.891029 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5255cba1-2ca7-460a-b112-28aa45156734" containerName="mariadb-database-create"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.891038 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fa3398-4cf5-4247-b31d-f08de7692fa2" containerName="mariadb-account-create-update"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.891586 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.893416 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.909297 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tttct"]
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.975862 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/335dcb88-5e6e-46da-b481-a9c930592195-operator-scripts\") pod \"root-account-create-update-tttct\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") " pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:40 crc kubenswrapper[4725]: I0227 06:30:40.975921 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwxm\" (UniqueName: \"kubernetes.io/projected/335dcb88-5e6e-46da-b481-a9c930592195-kube-api-access-rbwxm\") pod \"root-account-create-update-tttct\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") " pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.078187 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/335dcb88-5e6e-46da-b481-a9c930592195-operator-scripts\") pod \"root-account-create-update-tttct\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") " pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.078329 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwxm\" (UniqueName: \"kubernetes.io/projected/335dcb88-5e6e-46da-b481-a9c930592195-kube-api-access-rbwxm\") pod \"root-account-create-update-tttct\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") " pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.079811 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/335dcb88-5e6e-46da-b481-a9c930592195-operator-scripts\") pod \"root-account-create-update-tttct\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") " pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.149540 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwxm\" (UniqueName: \"kubernetes.io/projected/335dcb88-5e6e-46da-b481-a9c930592195-kube-api-access-rbwxm\") pod \"root-account-create-update-tttct\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") " pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.205662 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.657445 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98cpk" event={"ID":"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6","Type":"ContainerStarted","Data":"d90ae8f9f997db791a66cb3b1457023788e90dfa980c2ca7b3fda060049df9e0"}
Feb 27 06:30:41 crc kubenswrapper[4725]: I0227 06:30:41.773616 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tttct"]
Feb 27 06:30:42 crc kubenswrapper[4725]: I0227 06:30:42.308469 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 27 06:30:42 crc kubenswrapper[4725]: I0227 06:30:42.310954 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 27 06:30:42 crc kubenswrapper[4725]: I0227 06:30:42.667529 4725 generic.go:334] "Generic (PLEG): container finished" podID="335dcb88-5e6e-46da-b481-a9c930592195" containerID="d5d9320113c890c37f0856d858e7bc52c906fea1717b0c40f482e7db39672ea4" exitCode=0
Feb 27 06:30:42 crc kubenswrapper[4725]: I0227 06:30:42.667639 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tttct" event={"ID":"335dcb88-5e6e-46da-b481-a9c930592195","Type":"ContainerDied","Data":"d5d9320113c890c37f0856d858e7bc52c906fea1717b0c40f482e7db39672ea4"}
Feb 27 06:30:42 crc kubenswrapper[4725]: I0227 06:30:42.667934 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tttct" event={"ID":"335dcb88-5e6e-46da-b481-a9c930592195","Type":"ContainerStarted","Data":"27b07b72f023e5f1978969295027dcf2587c8d8c17acbcd28ff5d5103169b528"}
Feb 27 06:30:42 crc kubenswrapper[4725]: I0227 06:30:42.669675 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.058317 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.129015 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbwxm\" (UniqueName: \"kubernetes.io/projected/335dcb88-5e6e-46da-b481-a9c930592195-kube-api-access-rbwxm\") pod \"335dcb88-5e6e-46da-b481-a9c930592195\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") "
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.129095 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/335dcb88-5e6e-46da-b481-a9c930592195-operator-scripts\") pod \"335dcb88-5e6e-46da-b481-a9c930592195\" (UID: \"335dcb88-5e6e-46da-b481-a9c930592195\") "
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.129538 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335dcb88-5e6e-46da-b481-a9c930592195-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "335dcb88-5e6e-46da-b481-a9c930592195" (UID: "335dcb88-5e6e-46da-b481-a9c930592195"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.134253 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/335dcb88-5e6e-46da-b481-a9c930592195-kube-api-access-rbwxm" (OuterVolumeSpecName: "kube-api-access-rbwxm") pod "335dcb88-5e6e-46da-b481-a9c930592195" (UID: "335dcb88-5e6e-46da-b481-a9c930592195"). InnerVolumeSpecName "kube-api-access-rbwxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.230774 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbwxm\" (UniqueName: \"kubernetes.io/projected/335dcb88-5e6e-46da-b481-a9c930592195-kube-api-access-rbwxm\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.230806 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/335dcb88-5e6e-46da-b481-a9c930592195-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.683809 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tttct" event={"ID":"335dcb88-5e6e-46da-b481-a9c930592195","Type":"ContainerDied","Data":"27b07b72f023e5f1978969295027dcf2587c8d8c17acbcd28ff5d5103169b528"}
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.683847 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27b07b72f023e5f1978969295027dcf2587c8d8c17acbcd28ff5d5103169b528"
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.683898 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tttct"
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.686147 4725 generic.go:334] "Generic (PLEG): container finished" podID="40a2ae59-8725-42be-984a-739a82d476c5" containerID="3dd7af17451f066d14837c43b1002f8a5bf89ffb7082849f81e3d22e7a0b4981" exitCode=0
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.686177 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bq24l" event={"ID":"40a2ae59-8725-42be-984a-739a82d476c5","Type":"ContainerDied","Data":"3dd7af17451f066d14837c43b1002f8a5bf89ffb7082849f81e3d22e7a0b4981"}
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.812050 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.813213 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="prometheus" containerID="cri-o://ae592e258d79cb09c8b03281de57f0d5e3f84ba2c19b27c712b185ab71c40bbf" gracePeriod=600
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.813331 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="thanos-sidecar" containerID="cri-o://63eb8bad680b557d83340805c294e94eb30b1a95bb7f30a52be57834007a6586" gracePeriod=600
Feb 27 06:30:44 crc kubenswrapper[4725]: I0227 06:30:44.813510 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="config-reloader" containerID="cri-o://e63a4b09ef3287ec49a1d044f8543c0f227df5129cec65d081f31d4a607b09cb" gracePeriod=600
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.697608 4725 generic.go:334] "Generic (PLEG): container finished" podID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerID="63eb8bad680b557d83340805c294e94eb30b1a95bb7f30a52be57834007a6586" exitCode=0
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.697865 4725 generic.go:334] "Generic (PLEG): container finished" podID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerID="e63a4b09ef3287ec49a1d044f8543c0f227df5129cec65d081f31d4a607b09cb" exitCode=0
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.697874 4725 generic.go:334] "Generic (PLEG): container finished" podID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerID="ae592e258d79cb09c8b03281de57f0d5e3f84ba2c19b27c712b185ab71c40bbf" exitCode=0
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.697755 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerDied","Data":"63eb8bad680b557d83340805c294e94eb30b1a95bb7f30a52be57834007a6586"}
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.698047 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerDied","Data":"e63a4b09ef3287ec49a1d044f8543c0f227df5129cec65d081f31d4a607b09cb"}
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.698061 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerDied","Data":"ae592e258d79cb09c8b03281de57f0d5e3f84ba2c19b27c712b185ab71c40bbf"}
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.840121 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958375 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-tls-assets\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") "
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958444 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7421655b-5f80-4ec8-94f7-73a189f7460f-config-out\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") "
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958494 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-0\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") "
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958533 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-web-config\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") "
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958597 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-thanos-prometheus-http-client-file\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") "
Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958652 4725 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-config\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958676 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-2\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958698 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpzcl\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-kube-api-access-gpzcl\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958820 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.958850 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-1\") pod \"7421655b-5f80-4ec8-94f7-73a189f7460f\" (UID: \"7421655b-5f80-4ec8-94f7-73a189f7460f\") " Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.965167 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.965181 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.965676 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-config" (OuterVolumeSpecName: "config") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.965699 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.967057 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.969428 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-kube-api-access-gpzcl" (OuterVolumeSpecName: "kube-api-access-gpzcl") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "kube-api-access-gpzcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.969437 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7421655b-5f80-4ec8-94f7-73a189f7460f-config-out" (OuterVolumeSpecName: "config-out") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:30:45 crc kubenswrapper[4725]: I0227 06:30:45.981500 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.008693 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.012488 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-web-config" (OuterVolumeSpecName: "web-config") pod "7421655b-5f80-4ec8-94f7-73a189f7460f" (UID: "7421655b-5f80-4ec8-94f7-73a189f7460f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.060982 4725 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7421655b-5f80-4ec8-94f7-73a189f7460f-config-out\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061025 4725 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061038 4725 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-web-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061054 4725 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061067 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7421655b-5f80-4ec8-94f7-73a189f7460f-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061079 4725 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061091 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpzcl\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-kube-api-access-gpzcl\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061136 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") on node \"crc\" " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061163 4725 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7421655b-5f80-4ec8-94f7-73a189f7460f-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.061176 4725 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7421655b-5f80-4ec8-94f7-73a189f7460f-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.078896 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.081980 4725 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.082131 4725 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd") on node "crc" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.162535 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-scripts\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.162636 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40a2ae59-8725-42be-984a-739a82d476c5-etc-swift\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.162670 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-dispersionconf\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.162735 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-swiftconf\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 
06:30:46.162775 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-combined-ca-bundle\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.162846 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j65j\" (UniqueName: \"kubernetes.io/projected/40a2ae59-8725-42be-984a-739a82d476c5-kube-api-access-5j65j\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.162871 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-ring-data-devices\") pod \"40a2ae59-8725-42be-984a-739a82d476c5\" (UID: \"40a2ae59-8725-42be-984a-739a82d476c5\") " Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.163409 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40a2ae59-8725-42be-984a-739a82d476c5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.163437 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.163665 4725 reconciler_common.go:293] "Volume detached for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.163683 4725 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40a2ae59-8725-42be-984a-739a82d476c5-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.163692 4725 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.175603 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a2ae59-8725-42be-984a-739a82d476c5-kube-api-access-5j65j" (OuterVolumeSpecName: "kube-api-access-5j65j") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "kube-api-access-5j65j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.188240 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.189405 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.195492 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.196984 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-scripts" (OuterVolumeSpecName: "scripts") pod "40a2ae59-8725-42be-984a-739a82d476c5" (UID: "40a2ae59-8725-42be-984a-739a82d476c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.268295 4725 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.268325 4725 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.268334 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a2ae59-8725-42be-984a-739a82d476c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.268344 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j65j\" (UniqueName: \"kubernetes.io/projected/40a2ae59-8725-42be-984a-739a82d476c5-kube-api-access-5j65j\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.268353 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a2ae59-8725-42be-984a-739a82d476c5-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.707306 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bq24l" event={"ID":"40a2ae59-8725-42be-984a-739a82d476c5","Type":"ContainerDied","Data":"36ca0f299c2944a6ba4d1303a189b6ef5b34d3bf647c5c7a1e6e35bee8079be6"} Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.707346 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ca0f299c2944a6ba4d1303a189b6ef5b34d3bf647c5c7a1e6e35bee8079be6" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.707407 4725 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bq24l" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.710360 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7421655b-5f80-4ec8-94f7-73a189f7460f","Type":"ContainerDied","Data":"d7ac80521eff8a269104151bd837d9fc8c8c7ffa4a5e05740e496c86e814133d"} Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.710399 4725 scope.go:117] "RemoveContainer" containerID="63eb8bad680b557d83340805c294e94eb30b1a95bb7f30a52be57834007a6586" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.710524 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.739516 4725 scope.go:117] "RemoveContainer" containerID="e63a4b09ef3287ec49a1d044f8543c0f227df5129cec65d081f31d4a607b09cb" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.740729 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.767141 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.803473 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:30:46 crc kubenswrapper[4725]: E0227 06:30:46.804073 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="prometheus" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804086 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="prometheus" Feb 27 06:30:46 crc kubenswrapper[4725]: E0227 06:30:46.804104 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="thanos-sidecar" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804110 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="thanos-sidecar" Feb 27 06:30:46 crc kubenswrapper[4725]: E0227 06:30:46.804124 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="config-reloader" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804131 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="config-reloader" Feb 27 06:30:46 crc kubenswrapper[4725]: E0227 06:30:46.804144 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335dcb88-5e6e-46da-b481-a9c930592195" containerName="mariadb-account-create-update" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804151 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="335dcb88-5e6e-46da-b481-a9c930592195" containerName="mariadb-account-create-update" Feb 27 06:30:46 crc kubenswrapper[4725]: E0227 06:30:46.804175 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40a2ae59-8725-42be-984a-739a82d476c5" containerName="swift-ring-rebalance" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804181 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a2ae59-8725-42be-984a-739a82d476c5" containerName="swift-ring-rebalance" Feb 27 06:30:46 crc kubenswrapper[4725]: E0227 06:30:46.804197 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="init-config-reloader" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804203 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="init-config-reloader" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804899 4725 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="prometheus" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804916 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="thanos-sidecar" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804929 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="335dcb88-5e6e-46da-b481-a9c930592195" containerName="mariadb-account-create-update" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804945 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" containerName="config-reloader" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.804958 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a2ae59-8725-42be-984a-739a82d476c5" containerName="swift-ring-rebalance" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.809001 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.822741 4725 scope.go:117] "RemoveContainer" containerID="ae592e258d79cb09c8b03281de57f0d5e3f84ba2c19b27c712b185ab71c40bbf" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.823818 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.823983 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.824121 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.825713 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l87pd" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.825780 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.825830 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.825915 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.828404 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.834736 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.839564 4725 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.851754 4725 scope.go:117] "RemoveContainer" containerID="5e218148737f8bad9851446477af91e18255c885078c85ef97b521e222b40d2d" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884325 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884365 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884395 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884551 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " 
pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884655 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884832 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884937 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.884972 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1729f58a-98a0-4128-8644-c1a7643f09c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.885013 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.885076 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.885175 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkhdg\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-kube-api-access-zkhdg\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.885208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.885311 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc 
kubenswrapper[4725]: I0227 06:30:46.986746 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkhdg\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-kube-api-access-zkhdg\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986791 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986829 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986862 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986885 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 
06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986911 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986935 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.986966 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.987039 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.987076 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.987093 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1729f58a-98a0-4128-8644-c1a7643f09c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.987113 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.987139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.987836 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.988330 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.988367 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.990932 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1729f58a-98a0-4128-8644-c1a7643f09c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.991842 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.992016 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.992834 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.992971 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.994422 4725 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.994513 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e30db3ebf2fd7ac6c73c4f03a68dbdb833990d29c0091afb2dd5fd8d7a51236/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.994923 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:46 crc kubenswrapper[4725]: I0227 06:30:46.994959 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.007627 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkhdg\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-kube-api-access-zkhdg\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.011923 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.030926 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.142403 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.414866 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tttct"] Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.422162 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tttct"] Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.644936 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:30:47 crc kubenswrapper[4725]: W0227 06:30:47.660773 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1729f58a_98a0_4128_8644_c1a7643f09c8.slice/crio-904f0e685b29974d82c20e11a2e80f4578184d3ffab44721b02b0f72630f2367 WatchSource:0}: Error finding container 904f0e685b29974d82c20e11a2e80f4578184d3ffab44721b02b0f72630f2367: Status 404 returned error can't find the container with id 904f0e685b29974d82c20e11a2e80f4578184d3ffab44721b02b0f72630f2367 Feb 27 06:30:47 crc kubenswrapper[4725]: I0227 06:30:47.720323 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerStarted","Data":"904f0e685b29974d82c20e11a2e80f4578184d3ffab44721b02b0f72630f2367"} Feb 27 06:30:48 crc kubenswrapper[4725]: I0227 06:30:48.269725 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="335dcb88-5e6e-46da-b481-a9c930592195" path="/var/lib/kubelet/pods/335dcb88-5e6e-46da-b481-a9c930592195/volumes" Feb 27 06:30:48 crc kubenswrapper[4725]: I0227 06:30:48.271833 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7421655b-5f80-4ec8-94f7-73a189f7460f" path="/var/lib/kubelet/pods/7421655b-5f80-4ec8-94f7-73a189f7460f/volumes" Feb 27 06:30:49 crc kubenswrapper[4725]: I0227 06:30:49.003064 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 27 06:30:50 crc kubenswrapper[4725]: I0227 06:30:50.162410 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0" Feb 27 06:30:50 crc kubenswrapper[4725]: I0227 06:30:50.423712 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Feb 27 06:30:50 crc kubenswrapper[4725]: I0227 06:30:50.737105 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Feb 27 06:30:50 crc kubenswrapper[4725]: I0227 06:30:50.748868 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerStarted","Data":"970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c"} Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.444881 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-g78th"] Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.446302 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.448642 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.456107 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g78th"] Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.489831 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7f6\" (UniqueName: \"kubernetes.io/projected/dac537af-ac73-4c4e-947a-cc2120ccb158-kube-api-access-7t7f6\") pod \"root-account-create-update-g78th\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.489880 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac537af-ac73-4c4e-947a-cc2120ccb158-operator-scripts\") pod \"root-account-create-update-g78th\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.591919 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7f6\" (UniqueName: \"kubernetes.io/projected/dac537af-ac73-4c4e-947a-cc2120ccb158-kube-api-access-7t7f6\") pod \"root-account-create-update-g78th\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.591983 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac537af-ac73-4c4e-947a-cc2120ccb158-operator-scripts\") pod \"root-account-create-update-g78th\" (UID: 
\"dac537af-ac73-4c4e-947a-cc2120ccb158\") " pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.592987 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac537af-ac73-4c4e-947a-cc2120ccb158-operator-scripts\") pod \"root-account-create-update-g78th\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.611305 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7f6\" (UniqueName: \"kubernetes.io/projected/dac537af-ac73-4c4e-947a-cc2120ccb158-kube-api-access-7t7f6\") pod \"root-account-create-update-g78th\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " pod="openstack/root-account-create-update-g78th" Feb 27 06:30:52 crc kubenswrapper[4725]: I0227 06:30:52.765940 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g78th" Feb 27 06:30:53 crc kubenswrapper[4725]: I0227 06:30:53.931211 4725 scope.go:117] "RemoveContainer" containerID="9fd2166b2f3cc3ee52a6ffcd3510fc2d804af69ad0cd03808ce4cca8dc90b614" Feb 27 06:30:57 crc kubenswrapper[4725]: I0227 06:30:57.655902 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g78th"] Feb 27 06:30:57 crc kubenswrapper[4725]: I0227 06:30:57.809379 4725 generic.go:334] "Generic (PLEG): container finished" podID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerID="970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c" exitCode=0 Feb 27 06:30:57 crc kubenswrapper[4725]: I0227 06:30:57.809480 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerDied","Data":"970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c"} Feb 27 06:30:57 crc kubenswrapper[4725]: I0227 06:30:57.811682 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g78th" event={"ID":"dac537af-ac73-4c4e-947a-cc2120ccb158","Type":"ContainerStarted","Data":"26ccfd0f078e62fc6d4642073074b7d790ef772c1cc39bcca3cf9f1e83489707"} Feb 27 06:30:57 crc kubenswrapper[4725]: I0227 06:30:57.811713 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g78th" event={"ID":"dac537af-ac73-4c4e-947a-cc2120ccb158","Type":"ContainerStarted","Data":"a7b125ec3e12754a2ecf456a43abc45e3e2d187709903e1206b5ed02759807d7"} Feb 27 06:30:57 crc kubenswrapper[4725]: I0227 06:30:57.875876 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-g78th" podStartSLOduration=5.875858003 podStartE2EDuration="5.875858003s" podCreationTimestamp="2026-02-27 06:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:30:57.862916589 +0000 UTC m=+1236.325537158" watchObservedRunningTime="2026-02-27 06:30:57.875858003 +0000 UTC m=+1236.338478562" Feb 27 06:30:58 crc kubenswrapper[4725]: I0227 06:30:58.883607 4725 generic.go:334] "Generic (PLEG): container finished" podID="dac537af-ac73-4c4e-947a-cc2120ccb158" containerID="26ccfd0f078e62fc6d4642073074b7d790ef772c1cc39bcca3cf9f1e83489707" exitCode=0 Feb 27 06:30:58 crc kubenswrapper[4725]: I0227 06:30:58.884083 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g78th" event={"ID":"dac537af-ac73-4c4e-947a-cc2120ccb158","Type":"ContainerDied","Data":"26ccfd0f078e62fc6d4642073074b7d790ef772c1cc39bcca3cf9f1e83489707"} Feb 27 06:30:58 crc kubenswrapper[4725]: I0227 06:30:58.888972 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98cpk" event={"ID":"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6","Type":"ContainerStarted","Data":"ba300aa808522a64bffe439dff7f8181467f6abc326bc4fa504121289b71f496"} Feb 27 06:30:58 crc kubenswrapper[4725]: I0227 06:30:58.901598 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerStarted","Data":"8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421"} Feb 27 06:30:58 crc kubenswrapper[4725]: I0227 06:30:58.930211 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-98cpk" podStartSLOduration=3.341521153 podStartE2EDuration="19.930191885s" podCreationTimestamp="2026-02-27 06:30:39 +0000 UTC" firstStartedPulling="2026-02-27 06:30:40.660335692 +0000 UTC m=+1219.122956261" lastFinishedPulling="2026-02-27 06:30:57.249006404 +0000 UTC m=+1235.711626993" observedRunningTime="2026-02-27 06:30:58.927040476 +0000 UTC m=+1237.389661075" watchObservedRunningTime="2026-02-27 06:30:58.930191885 
+0000 UTC m=+1237.392812464" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.337757 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g78th" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.424543 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.434319 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t7f6\" (UniqueName: \"kubernetes.io/projected/dac537af-ac73-4c4e-947a-cc2120ccb158-kube-api-access-7t7f6\") pod \"dac537af-ac73-4c4e-947a-cc2120ccb158\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.434606 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac537af-ac73-4c4e-947a-cc2120ccb158-operator-scripts\") pod \"dac537af-ac73-4c4e-947a-cc2120ccb158\" (UID: \"dac537af-ac73-4c4e-947a-cc2120ccb158\") " Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.435562 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dac537af-ac73-4c4e-947a-cc2120ccb158-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dac537af-ac73-4c4e-947a-cc2120ccb158" (UID: "dac537af-ac73-4c4e-947a-cc2120ccb158"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.474665 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac537af-ac73-4c4e-947a-cc2120ccb158-kube-api-access-7t7f6" (OuterVolumeSpecName: "kube-api-access-7t7f6") pod "dac537af-ac73-4c4e-947a-cc2120ccb158" (UID: "dac537af-ac73-4c4e-947a-cc2120ccb158"). InnerVolumeSpecName "kube-api-access-7t7f6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.537572 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac537af-ac73-4c4e-947a-cc2120ccb158-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.537624 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t7f6\" (UniqueName: \"kubernetes.io/projected/dac537af-ac73-4c4e-947a-cc2120ccb158-kube-api-access-7t7f6\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.741418 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.865831 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6nmms"] Feb 27 06:31:00 crc kubenswrapper[4725]: E0227 06:31:00.866607 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac537af-ac73-4c4e-947a-cc2120ccb158" containerName="mariadb-account-create-update" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.866621 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac537af-ac73-4c4e-947a-cc2120ccb158" containerName="mariadb-account-create-update" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.866786 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac537af-ac73-4c4e-947a-cc2120ccb158" containerName="mariadb-account-create-update" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.867373 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.880525 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6nmms"] Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.925715 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g78th" event={"ID":"dac537af-ac73-4c4e-947a-cc2120ccb158","Type":"ContainerDied","Data":"a7b125ec3e12754a2ecf456a43abc45e3e2d187709903e1206b5ed02759807d7"} Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.925754 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7b125ec3e12754a2ecf456a43abc45e3e2d187709903e1206b5ed02759807d7" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.925819 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-g78th" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.971746 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-877a-account-create-update-cn8jq"] Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.972705 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.975886 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 27 06:31:00 crc kubenswrapper[4725]: I0227 06:31:00.989710 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-877a-account-create-update-cn8jq"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.050156 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe3bf48-3f87-4630-b879-65a1614acb41-operator-scripts\") pod \"cinder-db-create-6nmms\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.050206 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76x9r\" (UniqueName: \"kubernetes.io/projected/fbe3bf48-3f87-4630-b879-65a1614acb41-kube-api-access-76x9r\") pod \"cinder-db-create-6nmms\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.131360 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vvwlj"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.132369 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.134523 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4dkt" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.135819 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.136004 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.136162 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.142875 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vvwlj"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.153736 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3424ee2-48c0-4904-a435-0acde28a6043-operator-scripts\") pod \"cinder-877a-account-create-update-cn8jq\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.153788 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmwx4\" (UniqueName: \"kubernetes.io/projected/a3424ee2-48c0-4904-a435-0acde28a6043-kube-api-access-pmwx4\") pod \"cinder-877a-account-create-update-cn8jq\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.153816 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fbe3bf48-3f87-4630-b879-65a1614acb41-operator-scripts\") pod \"cinder-db-create-6nmms\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.153853 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76x9r\" (UniqueName: \"kubernetes.io/projected/fbe3bf48-3f87-4630-b879-65a1614acb41-kube-api-access-76x9r\") pod \"cinder-db-create-6nmms\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.154793 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe3bf48-3f87-4630-b879-65a1614acb41-operator-scripts\") pod \"cinder-db-create-6nmms\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.183396 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76x9r\" (UniqueName: \"kubernetes.io/projected/fbe3bf48-3f87-4630-b879-65a1614acb41-kube-api-access-76x9r\") pod \"cinder-db-create-6nmms\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.190320 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.210331 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fsxct"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.211425 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.232799 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fsxct"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.249440 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-063b-account-create-update-tz8vx"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.252110 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.256033 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-combined-ca-bundle\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.256079 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-config-data\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.256131 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdtt\" (UniqueName: \"kubernetes.io/projected/df17b144-75a9-44a8-a2b4-4694687dc01f-kube-api-access-fzdtt\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.256175 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a3424ee2-48c0-4904-a435-0acde28a6043-operator-scripts\") pod \"cinder-877a-account-create-update-cn8jq\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.256198 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmwx4\" (UniqueName: \"kubernetes.io/projected/a3424ee2-48c0-4904-a435-0acde28a6043-kube-api-access-pmwx4\") pod \"cinder-877a-account-create-update-cn8jq\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.257041 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3424ee2-48c0-4904-a435-0acde28a6043-operator-scripts\") pod \"cinder-877a-account-create-update-cn8jq\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.259873 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.268341 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-063b-account-create-update-tz8vx"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.302106 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2q7cz"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.304220 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.310357 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmwx4\" (UniqueName: \"kubernetes.io/projected/a3424ee2-48c0-4904-a435-0acde28a6043-kube-api-access-pmwx4\") pod \"cinder-877a-account-create-update-cn8jq\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.331779 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2q7cz"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-combined-ca-bundle\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358490 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvwd\" (UniqueName: \"kubernetes.io/projected/1cd9edff-02fe-4305-81b9-ee8fbea78f20-kube-api-access-kzvwd\") pod \"neutron-db-create-2q7cz\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358522 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78811b9d-d814-4266-911b-9c466a6df5e4-operator-scripts\") pod \"barbican-063b-account-create-update-tz8vx\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358566 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-config-data\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358625 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd9edff-02fe-4305-81b9-ee8fbea78f20-operator-scripts\") pod \"neutron-db-create-2q7cz\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358653 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94647b2a-f9bf-498e-bb33-ca18ee334284-operator-scripts\") pod \"barbican-db-create-fsxct\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358699 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdtt\" (UniqueName: \"kubernetes.io/projected/df17b144-75a9-44a8-a2b4-4694687dc01f-kube-api-access-fzdtt\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358731 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xng\" (UniqueName: \"kubernetes.io/projected/78811b9d-d814-4266-911b-9c466a6df5e4-kube-api-access-d2xng\") pod \"barbican-063b-account-create-update-tz8vx\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.358790 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrmv\" (UniqueName: \"kubernetes.io/projected/94647b2a-f9bf-498e-bb33-ca18ee334284-kube-api-access-zfrmv\") pod \"barbican-db-create-fsxct\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.370022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-config-data\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.372012 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-combined-ca-bundle\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.394770 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e038-account-create-update-7k69g"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.395863 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.396012 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdtt\" (UniqueName: \"kubernetes.io/projected/df17b144-75a9-44a8-a2b4-4694687dc01f-kube-api-access-fzdtt\") pod \"keystone-db-sync-vvwlj\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.406531 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.409159 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-c7txp"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.410253 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.411837 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.413089 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m7t9c" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.431414 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e038-account-create-update-7k69g"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.459340 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.462219 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xng\" (UniqueName: \"kubernetes.io/projected/78811b9d-d814-4266-911b-9c466a6df5e4-kube-api-access-d2xng\") pod \"barbican-063b-account-create-update-tz8vx\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.462252 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8fh\" (UniqueName: \"kubernetes.io/projected/e6df5742-cb9e-4731-aa27-0efe73e9e61a-kube-api-access-mb8fh\") pod \"neutron-e038-account-create-update-7k69g\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.462794 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-c7txp"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470448 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfrmv\" (UniqueName: \"kubernetes.io/projected/94647b2a-f9bf-498e-bb33-ca18ee334284-kube-api-access-zfrmv\") pod \"barbican-db-create-fsxct\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470655 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-combined-ca-bundle\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470700 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8h5j\" (UniqueName: \"kubernetes.io/projected/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-kube-api-access-l8h5j\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470721 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-db-sync-config-data\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6df5742-cb9e-4731-aa27-0efe73e9e61a-operator-scripts\") pod \"neutron-e038-account-create-update-7k69g\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470857 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvwd\" (UniqueName: \"kubernetes.io/projected/1cd9edff-02fe-4305-81b9-ee8fbea78f20-kube-api-access-kzvwd\") pod \"neutron-db-create-2q7cz\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.470882 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78811b9d-d814-4266-911b-9c466a6df5e4-operator-scripts\") pod \"barbican-063b-account-create-update-tz8vx\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc 
kubenswrapper[4725]: I0227 06:31:01.470910 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-config-data\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.471018 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd9edff-02fe-4305-81b9-ee8fbea78f20-operator-scripts\") pod \"neutron-db-create-2q7cz\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.471047 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94647b2a-f9bf-498e-bb33-ca18ee334284-operator-scripts\") pod \"barbican-db-create-fsxct\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.471715 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94647b2a-f9bf-498e-bb33-ca18ee334284-operator-scripts\") pod \"barbican-db-create-fsxct\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.471963 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78811b9d-d814-4266-911b-9c466a6df5e4-operator-scripts\") pod \"barbican-063b-account-create-update-tz8vx\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.472376 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd9edff-02fe-4305-81b9-ee8fbea78f20-operator-scripts\") pod \"neutron-db-create-2q7cz\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.496999 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xng\" (UniqueName: \"kubernetes.io/projected/78811b9d-d814-4266-911b-9c466a6df5e4-kube-api-access-d2xng\") pod \"barbican-063b-account-create-update-tz8vx\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.504308 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrmv\" (UniqueName: \"kubernetes.io/projected/94647b2a-f9bf-498e-bb33-ca18ee334284-kube-api-access-zfrmv\") pod \"barbican-db-create-fsxct\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.506039 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvwd\" (UniqueName: \"kubernetes.io/projected/1cd9edff-02fe-4305-81b9-ee8fbea78f20-kube-api-access-kzvwd\") pod \"neutron-db-create-2q7cz\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.572579 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8h5j\" (UniqueName: \"kubernetes.io/projected/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-kube-api-access-l8h5j\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.572642 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-db-sync-config-data\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.572696 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6df5742-cb9e-4731-aa27-0efe73e9e61a-operator-scripts\") pod \"neutron-e038-account-create-update-7k69g\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.572779 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-config-data\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.572899 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8fh\" (UniqueName: \"kubernetes.io/projected/e6df5742-cb9e-4731-aa27-0efe73e9e61a-kube-api-access-mb8fh\") pod \"neutron-e038-account-create-update-7k69g\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.572954 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-combined-ca-bundle\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.576481 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6df5742-cb9e-4731-aa27-0efe73e9e61a-operator-scripts\") pod \"neutron-e038-account-create-update-7k69g\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.581982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-config-data\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.583232 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-combined-ca-bundle\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.583581 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-db-sync-config-data\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.589878 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.592806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8h5j\" (UniqueName: \"kubernetes.io/projected/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-kube-api-access-l8h5j\") pod \"watcher-db-sync-c7txp\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.610556 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8fh\" (UniqueName: \"kubernetes.io/projected/e6df5742-cb9e-4731-aa27-0efe73e9e61a-kube-api-access-mb8fh\") pod \"neutron-e038-account-create-update-7k69g\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.634919 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.768965 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.801851 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.840896 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6nmms"] Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.846441 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:01 crc kubenswrapper[4725]: I0227 06:31:01.855367 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.010566 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6nmms" event={"ID":"fbe3bf48-3f87-4630-b879-65a1614acb41","Type":"ContainerStarted","Data":"a330879bfdc11bd6282d6434d84aa00aa12552969120cfa279c998b3d99197c4"} Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.022895 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerStarted","Data":"44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98"} Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.022954 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerStarted","Data":"cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac"} Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.034253 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vvwlj"] Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.056773 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fsxct"] Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.070994 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.070976326 podStartE2EDuration="16.070976326s" podCreationTimestamp="2026-02-27 06:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:02.067585671 +0000 UTC m=+1240.530206250" watchObservedRunningTime="2026-02-27 06:31:02.070976326 +0000 UTC m=+1240.533596895" Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.094089 4725 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-063b-account-create-update-tz8vx"] Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.124535 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-877a-account-create-update-cn8jq"] Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.144756 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.144786 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 06:31:02 crc kubenswrapper[4725]: W0227 06:31:02.151116 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78811b9d_d814_4266_911b_9c466a6df5e4.slice/crio-aafe6c871b37cf9a703b1c7b37141260c203ce8b8800ebbbdc811c1323f291ca WatchSource:0}: Error finding container aafe6c871b37cf9a703b1c7b37141260c203ce8b8800ebbbdc811c1323f291ca: Status 404 returned error can't find the container with id aafe6c871b37cf9a703b1c7b37141260c203ce8b8800ebbbdc811c1323f291ca Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.161973 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.170991 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2q7cz"] Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.589834 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e038-account-create-update-7k69g"] Feb 27 06:31:02 crc kubenswrapper[4725]: I0227 06:31:02.841046 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-c7txp"] Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.036863 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fsxct" 
event={"ID":"94647b2a-f9bf-498e-bb33-ca18ee334284","Type":"ContainerStarted","Data":"ecf5e7d116b70a689552aedc8937b97ef37d881dc82aa7e9d0b29abbc1ac8ac8"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.036907 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fsxct" event={"ID":"94647b2a-f9bf-498e-bb33-ca18ee334284","Type":"ContainerStarted","Data":"fc48f75b1d423319ea9b5d9a2cc880eda48ad7e41f466afe897bfe8c5cf3c7e2"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.039749 4725 generic.go:334] "Generic (PLEG): container finished" podID="fbe3bf48-3f87-4630-b879-65a1614acb41" containerID="9f1a737ef7eb69e0b3c45c4647b1d010e348982a9badebd73a266343fc8663c9" exitCode=0 Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.039830 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6nmms" event={"ID":"fbe3bf48-3f87-4630-b879-65a1614acb41","Type":"ContainerDied","Data":"9f1a737ef7eb69e0b3c45c4647b1d010e348982a9badebd73a266343fc8663c9"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.040997 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-c7txp" event={"ID":"be03cf3c-5ffa-40cd-9a69-cb386068bc2c","Type":"ContainerStarted","Data":"6f3a7524a3feebd93ef76f37651849652d8e7ce27597c0474ef743bc255d7a14"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.042956 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2q7cz" event={"ID":"1cd9edff-02fe-4305-81b9-ee8fbea78f20","Type":"ContainerStarted","Data":"47bbf6f0345636425feb69b0475a99faa69fcbb5d7cc1c7b03e5806a70973cab"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.042981 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2q7cz" event={"ID":"1cd9edff-02fe-4305-81b9-ee8fbea78f20","Type":"ContainerStarted","Data":"7c46b89839572437c2e3212537d3025a73031a43afd97a4a62955ed278755a15"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 
06:31:03.047205 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e038-account-create-update-7k69g" event={"ID":"e6df5742-cb9e-4731-aa27-0efe73e9e61a","Type":"ContainerStarted","Data":"b58ca8ecb6430f2c11c2126e975e12898b529c6d118421061806aab48926540d"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.047265 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e038-account-create-update-7k69g" event={"ID":"e6df5742-cb9e-4731-aa27-0efe73e9e61a","Type":"ContainerStarted","Data":"62e7cc3f48a813361d11b1af6fb5600903a9c39b26a6ea401e8774669fa493f0"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.051235 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-fsxct" podStartSLOduration=2.051221355 podStartE2EDuration="2.051221355s" podCreationTimestamp="2026-02-27 06:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:03.049108035 +0000 UTC m=+1241.511728624" watchObservedRunningTime="2026-02-27 06:31:03.051221355 +0000 UTC m=+1241.513841924" Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.053829 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-877a-account-create-update-cn8jq" event={"ID":"a3424ee2-48c0-4904-a435-0acde28a6043","Type":"ContainerStarted","Data":"0a5141fe3ac64466dc1c51f41f414fc9e051846f3141d391347f54f79c61d3ea"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.053877 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-877a-account-create-update-cn8jq" event={"ID":"a3424ee2-48c0-4904-a435-0acde28a6043","Type":"ContainerStarted","Data":"de3b4a42306035edb1eefc0fc1edae9da6309868ab39fc6a263aeeb5f30ca6df"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.064532 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vvwlj" 
event={"ID":"df17b144-75a9-44a8-a2b4-4694687dc01f","Type":"ContainerStarted","Data":"db1e5f7c7578a0158f307ac480f2ad96887cbe37373fe12f0297813d754c1a67"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.067652 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-063b-account-create-update-tz8vx" event={"ID":"78811b9d-d814-4266-911b-9c466a6df5e4","Type":"ContainerStarted","Data":"5aa334da6e28dba02b940326dbb9e9753d6bb49e62dd188b73f4ab0f088d367e"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.067710 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-063b-account-create-update-tz8vx" event={"ID":"78811b9d-d814-4266-911b-9c466a6df5e4","Type":"ContainerStarted","Data":"aafe6c871b37cf9a703b1c7b37141260c203ce8b8800ebbbdc811c1323f291ca"} Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.069564 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-e038-account-create-update-7k69g" podStartSLOduration=2.06954969 podStartE2EDuration="2.06954969s" podCreationTimestamp="2026-02-27 06:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:03.067495972 +0000 UTC m=+1241.530116562" watchObservedRunningTime="2026-02-27 06:31:03.06954969 +0000 UTC m=+1241.532170259" Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.090317 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.091199 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2q7cz" podStartSLOduration=2.091177669 podStartE2EDuration="2.091177669s" podCreationTimestamp="2026-02-27 06:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 
06:31:03.081856396 +0000 UTC m=+1241.544476955" watchObservedRunningTime="2026-02-27 06:31:03.091177669 +0000 UTC m=+1241.553798228" Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.120886 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-063b-account-create-update-tz8vx" podStartSLOduration=2.120874024 podStartE2EDuration="2.120874024s" podCreationTimestamp="2026-02-27 06:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:03.118273241 +0000 UTC m=+1241.580893830" watchObservedRunningTime="2026-02-27 06:31:03.120874024 +0000 UTC m=+1241.583494583" Feb 27 06:31:03 crc kubenswrapper[4725]: I0227 06:31:03.227644 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-877a-account-create-update-cn8jq" podStartSLOduration=3.227610206 podStartE2EDuration="3.227610206s" podCreationTimestamp="2026-02-27 06:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:03.135056143 +0000 UTC m=+1241.597676702" watchObservedRunningTime="2026-02-27 06:31:03.227610206 +0000 UTC m=+1241.690230795" Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.091776 4725 generic.go:334] "Generic (PLEG): container finished" podID="a3424ee2-48c0-4904-a435-0acde28a6043" containerID="0a5141fe3ac64466dc1c51f41f414fc9e051846f3141d391347f54f79c61d3ea" exitCode=0 Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.091857 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-877a-account-create-update-cn8jq" event={"ID":"a3424ee2-48c0-4904-a435-0acde28a6043","Type":"ContainerDied","Data":"0a5141fe3ac64466dc1c51f41f414fc9e051846f3141d391347f54f79c61d3ea"} Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.094925 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="78811b9d-d814-4266-911b-9c466a6df5e4" containerID="5aa334da6e28dba02b940326dbb9e9753d6bb49e62dd188b73f4ab0f088d367e" exitCode=0 Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.095013 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-063b-account-create-update-tz8vx" event={"ID":"78811b9d-d814-4266-911b-9c466a6df5e4","Type":"ContainerDied","Data":"5aa334da6e28dba02b940326dbb9e9753d6bb49e62dd188b73f4ab0f088d367e"} Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.097487 4725 generic.go:334] "Generic (PLEG): container finished" podID="94647b2a-f9bf-498e-bb33-ca18ee334284" containerID="ecf5e7d116b70a689552aedc8937b97ef37d881dc82aa7e9d0b29abbc1ac8ac8" exitCode=0 Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.097711 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fsxct" event={"ID":"94647b2a-f9bf-498e-bb33-ca18ee334284","Type":"ContainerDied","Data":"ecf5e7d116b70a689552aedc8937b97ef37d881dc82aa7e9d0b29abbc1ac8ac8"} Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.099675 4725 generic.go:334] "Generic (PLEG): container finished" podID="1cd9edff-02fe-4305-81b9-ee8fbea78f20" containerID="47bbf6f0345636425feb69b0475a99faa69fcbb5d7cc1c7b03e5806a70973cab" exitCode=0 Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.099736 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2q7cz" event={"ID":"1cd9edff-02fe-4305-81b9-ee8fbea78f20","Type":"ContainerDied","Data":"47bbf6f0345636425feb69b0475a99faa69fcbb5d7cc1c7b03e5806a70973cab"} Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.100953 4725 generic.go:334] "Generic (PLEG): container finished" podID="e6df5742-cb9e-4731-aa27-0efe73e9e61a" containerID="b58ca8ecb6430f2c11c2126e975e12898b529c6d118421061806aab48926540d" exitCode=0 Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.101159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-e038-account-create-update-7k69g" event={"ID":"e6df5742-cb9e-4731-aa27-0efe73e9e61a","Type":"ContainerDied","Data":"b58ca8ecb6430f2c11c2126e975e12898b529c6d118421061806aab48926540d"} Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.457129 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.547842 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76x9r\" (UniqueName: \"kubernetes.io/projected/fbe3bf48-3f87-4630-b879-65a1614acb41-kube-api-access-76x9r\") pod \"fbe3bf48-3f87-4630-b879-65a1614acb41\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.547941 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe3bf48-3f87-4630-b879-65a1614acb41-operator-scripts\") pod \"fbe3bf48-3f87-4630-b879-65a1614acb41\" (UID: \"fbe3bf48-3f87-4630-b879-65a1614acb41\") " Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.548799 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbe3bf48-3f87-4630-b879-65a1614acb41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbe3bf48-3f87-4630-b879-65a1614acb41" (UID: "fbe3bf48-3f87-4630-b879-65a1614acb41"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.554848 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe3bf48-3f87-4630-b879-65a1614acb41-kube-api-access-76x9r" (OuterVolumeSpecName: "kube-api-access-76x9r") pod "fbe3bf48-3f87-4630-b879-65a1614acb41" (UID: "fbe3bf48-3f87-4630-b879-65a1614acb41"). InnerVolumeSpecName "kube-api-access-76x9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.649896 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76x9r\" (UniqueName: \"kubernetes.io/projected/fbe3bf48-3f87-4630-b879-65a1614acb41-kube-api-access-76x9r\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:04 crc kubenswrapper[4725]: I0227 06:31:04.649936 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbe3bf48-3f87-4630-b879-65a1614acb41-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:05 crc kubenswrapper[4725]: I0227 06:31:05.111948 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6nmms" event={"ID":"fbe3bf48-3f87-4630-b879-65a1614acb41","Type":"ContainerDied","Data":"a330879bfdc11bd6282d6434d84aa00aa12552969120cfa279c998b3d99197c4"} Feb 27 06:31:05 crc kubenswrapper[4725]: I0227 06:31:05.112294 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a330879bfdc11bd6282d6434d84aa00aa12552969120cfa279c998b3d99197c4" Feb 27 06:31:05 crc kubenswrapper[4725]: I0227 06:31:05.112061 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6nmms" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.316171 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.325784 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.334838 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.348172 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.363334 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.442157 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmwx4\" (UniqueName: \"kubernetes.io/projected/a3424ee2-48c0-4904-a435-0acde28a6043-kube-api-access-pmwx4\") pod \"a3424ee2-48c0-4904-a435-0acde28a6043\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.442258 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd9edff-02fe-4305-81b9-ee8fbea78f20-operator-scripts\") pod \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.442323 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzvwd\" (UniqueName: \"kubernetes.io/projected/1cd9edff-02fe-4305-81b9-ee8fbea78f20-kube-api-access-kzvwd\") pod \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\" (UID: \"1cd9edff-02fe-4305-81b9-ee8fbea78f20\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.442368 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2xng\" (UniqueName: \"kubernetes.io/projected/78811b9d-d814-4266-911b-9c466a6df5e4-kube-api-access-d2xng\") pod \"78811b9d-d814-4266-911b-9c466a6df5e4\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.442470 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78811b9d-d814-4266-911b-9c466a6df5e4-operator-scripts\") pod \"78811b9d-d814-4266-911b-9c466a6df5e4\" (UID: \"78811b9d-d814-4266-911b-9c466a6df5e4\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.442498 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3424ee2-48c0-4904-a435-0acde28a6043-operator-scripts\") pod \"a3424ee2-48c0-4904-a435-0acde28a6043\" (UID: \"a3424ee2-48c0-4904-a435-0acde28a6043\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.443535 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78811b9d-d814-4266-911b-9c466a6df5e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78811b9d-d814-4266-911b-9c466a6df5e4" (UID: "78811b9d-d814-4266-911b-9c466a6df5e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.443528 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3424ee2-48c0-4904-a435-0acde28a6043-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3424ee2-48c0-4904-a435-0acde28a6043" (UID: "a3424ee2-48c0-4904-a435-0acde28a6043"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.443679 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cd9edff-02fe-4305-81b9-ee8fbea78f20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cd9edff-02fe-4305-81b9-ee8fbea78f20" (UID: "1cd9edff-02fe-4305-81b9-ee8fbea78f20"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.443894 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd9edff-02fe-4305-81b9-ee8fbea78f20-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.443948 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78811b9d-d814-4266-911b-9c466a6df5e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.444088 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3424ee2-48c0-4904-a435-0acde28a6043-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.448186 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3424ee2-48c0-4904-a435-0acde28a6043-kube-api-access-pmwx4" (OuterVolumeSpecName: "kube-api-access-pmwx4") pod "a3424ee2-48c0-4904-a435-0acde28a6043" (UID: "a3424ee2-48c0-4904-a435-0acde28a6043"). InnerVolumeSpecName "kube-api-access-pmwx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.448766 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd9edff-02fe-4305-81b9-ee8fbea78f20-kube-api-access-kzvwd" (OuterVolumeSpecName: "kube-api-access-kzvwd") pod "1cd9edff-02fe-4305-81b9-ee8fbea78f20" (UID: "1cd9edff-02fe-4305-81b9-ee8fbea78f20"). InnerVolumeSpecName "kube-api-access-kzvwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.451185 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78811b9d-d814-4266-911b-9c466a6df5e4-kube-api-access-d2xng" (OuterVolumeSpecName: "kube-api-access-d2xng") pod "78811b9d-d814-4266-911b-9c466a6df5e4" (UID: "78811b9d-d814-4266-911b-9c466a6df5e4"). InnerVolumeSpecName "kube-api-access-d2xng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.544754 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb8fh\" (UniqueName: \"kubernetes.io/projected/e6df5742-cb9e-4731-aa27-0efe73e9e61a-kube-api-access-mb8fh\") pod \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545012 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfrmv\" (UniqueName: \"kubernetes.io/projected/94647b2a-f9bf-498e-bb33-ca18ee334284-kube-api-access-zfrmv\") pod \"94647b2a-f9bf-498e-bb33-ca18ee334284\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545100 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6df5742-cb9e-4731-aa27-0efe73e9e61a-operator-scripts\") pod \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\" (UID: \"e6df5742-cb9e-4731-aa27-0efe73e9e61a\") " Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545118 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94647b2a-f9bf-498e-bb33-ca18ee334284-operator-scripts\") pod \"94647b2a-f9bf-498e-bb33-ca18ee334284\" (UID: \"94647b2a-f9bf-498e-bb33-ca18ee334284\") " Feb 27 06:31:08 crc 
kubenswrapper[4725]: I0227 06:31:08.545538 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmwx4\" (UniqueName: \"kubernetes.io/projected/a3424ee2-48c0-4904-a435-0acde28a6043-kube-api-access-pmwx4\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545554 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzvwd\" (UniqueName: \"kubernetes.io/projected/1cd9edff-02fe-4305-81b9-ee8fbea78f20-kube-api-access-kzvwd\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545563 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2xng\" (UniqueName: \"kubernetes.io/projected/78811b9d-d814-4266-911b-9c466a6df5e4-kube-api-access-d2xng\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545797 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6df5742-cb9e-4731-aa27-0efe73e9e61a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6df5742-cb9e-4731-aa27-0efe73e9e61a" (UID: "e6df5742-cb9e-4731-aa27-0efe73e9e61a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.545798 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94647b2a-f9bf-498e-bb33-ca18ee334284-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94647b2a-f9bf-498e-bb33-ca18ee334284" (UID: "94647b2a-f9bf-498e-bb33-ca18ee334284"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.548370 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94647b2a-f9bf-498e-bb33-ca18ee334284-kube-api-access-zfrmv" (OuterVolumeSpecName: "kube-api-access-zfrmv") pod "94647b2a-f9bf-498e-bb33-ca18ee334284" (UID: "94647b2a-f9bf-498e-bb33-ca18ee334284"). InnerVolumeSpecName "kube-api-access-zfrmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.548647 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6df5742-cb9e-4731-aa27-0efe73e9e61a-kube-api-access-mb8fh" (OuterVolumeSpecName: "kube-api-access-mb8fh") pod "e6df5742-cb9e-4731-aa27-0efe73e9e61a" (UID: "e6df5742-cb9e-4731-aa27-0efe73e9e61a"). InnerVolumeSpecName "kube-api-access-mb8fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.647528 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6df5742-cb9e-4731-aa27-0efe73e9e61a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.647564 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94647b2a-f9bf-498e-bb33-ca18ee334284-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.647577 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb8fh\" (UniqueName: \"kubernetes.io/projected/e6df5742-cb9e-4731-aa27-0efe73e9e61a-kube-api-access-mb8fh\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:08 crc kubenswrapper[4725]: I0227 06:31:08.647593 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfrmv\" (UniqueName: 
\"kubernetes.io/projected/94647b2a-f9bf-498e-bb33-ca18ee334284-kube-api-access-zfrmv\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.179433 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e038-account-create-update-7k69g" event={"ID":"e6df5742-cb9e-4731-aa27-0efe73e9e61a","Type":"ContainerDied","Data":"62e7cc3f48a813361d11b1af6fb5600903a9c39b26a6ea401e8774669fa493f0"} Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.179463 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e038-account-create-update-7k69g" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.179475 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e7cc3f48a813361d11b1af6fb5600903a9c39b26a6ea401e8774669fa493f0" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.181049 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-877a-account-create-update-cn8jq" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.181043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-877a-account-create-update-cn8jq" event={"ID":"a3424ee2-48c0-4904-a435-0acde28a6043","Type":"ContainerDied","Data":"de3b4a42306035edb1eefc0fc1edae9da6309868ab39fc6a263aeeb5f30ca6df"} Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.181115 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de3b4a42306035edb1eefc0fc1edae9da6309868ab39fc6a263aeeb5f30ca6df" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.183445 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-063b-account-create-update-tz8vx" event={"ID":"78811b9d-d814-4266-911b-9c466a6df5e4","Type":"ContainerDied","Data":"aafe6c871b37cf9a703b1c7b37141260c203ce8b8800ebbbdc811c1323f291ca"} Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.183478 4725 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aafe6c871b37cf9a703b1c7b37141260c203ce8b8800ebbbdc811c1323f291ca" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.183548 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-063b-account-create-update-tz8vx" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.186416 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fsxct" event={"ID":"94647b2a-f9bf-498e-bb33-ca18ee334284","Type":"ContainerDied","Data":"fc48f75b1d423319ea9b5d9a2cc880eda48ad7e41f466afe897bfe8c5cf3c7e2"} Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.186455 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc48f75b1d423319ea9b5d9a2cc880eda48ad7e41f466afe897bfe8c5cf3c7e2" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.186512 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fsxct" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.199140 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2q7cz" event={"ID":"1cd9edff-02fe-4305-81b9-ee8fbea78f20","Type":"ContainerDied","Data":"7c46b89839572437c2e3212537d3025a73031a43afd97a4a62955ed278755a15"} Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.199186 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c46b89839572437c2e3212537d3025a73031a43afd97a4a62955ed278755a15" Feb 27 06:31:09 crc kubenswrapper[4725]: I0227 06:31:09.199205 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2q7cz" Feb 27 06:31:10 crc kubenswrapper[4725]: I0227 06:31:10.212018 4725 generic.go:334] "Generic (PLEG): container finished" podID="dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" containerID="ba300aa808522a64bffe439dff7f8181467f6abc326bc4fa504121289b71f496" exitCode=0 Feb 27 06:31:10 crc kubenswrapper[4725]: I0227 06:31:10.212071 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98cpk" event={"ID":"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6","Type":"ContainerDied","Data":"ba300aa808522a64bffe439dff7f8181467f6abc326bc4fa504121289b71f496"} Feb 27 06:31:11 crc kubenswrapper[4725]: I0227 06:31:11.190124 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:31:11 crc kubenswrapper[4725]: I0227 06:31:11.203332 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/872eba69-b1d2-4028-b65f-b70fa14daeb0-etc-swift\") pod \"swift-storage-0\" (UID: \"872eba69-b1d2-4028-b65f-b70fa14daeb0\") " pod="openstack/swift-storage-0" Feb 27 06:31:11 crc kubenswrapper[4725]: I0227 06:31:11.414150 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 27 06:31:11 crc kubenswrapper[4725]: I0227 06:31:11.986499 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-98cpk" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.004346 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-config-data\") pod \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.069907 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-config-data" (OuterVolumeSpecName: "config-data") pod "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" (UID: "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.105686 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-combined-ca-bundle\") pod \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.105782 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-db-sync-config-data\") pod \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.105821 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbhsz\" (UniqueName: \"kubernetes.io/projected/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-kube-api-access-sbhsz\") pod \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\" (UID: \"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6\") " Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.106250 4725 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.110543 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-kube-api-access-sbhsz" (OuterVolumeSpecName: "kube-api-access-sbhsz") pod "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" (UID: "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6"). InnerVolumeSpecName "kube-api-access-sbhsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.111526 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" (UID: "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.134063 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" (UID: "dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.209122 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.209157 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbhsz\" (UniqueName: \"kubernetes.io/projected/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-kube-api-access-sbhsz\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.209172 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.239149 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-98cpk" event={"ID":"dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6","Type":"ContainerDied","Data":"d90ae8f9f997db791a66cb3b1457023788e90dfa980c2ca7b3fda060049df9e0"} Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.239194 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90ae8f9f997db791a66cb3b1457023788e90dfa980c2ca7b3fda060049df9e0" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.239254 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-98cpk" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.726335 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d57f9cf89-kzkgz"] Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727300 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe3bf48-3f87-4630-b879-65a1614acb41" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727322 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe3bf48-3f87-4630-b879-65a1614acb41" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727348 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd9edff-02fe-4305-81b9-ee8fbea78f20" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727357 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd9edff-02fe-4305-81b9-ee8fbea78f20" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727372 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94647b2a-f9bf-498e-bb33-ca18ee334284" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727380 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="94647b2a-f9bf-498e-bb33-ca18ee334284" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727393 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" containerName="glance-db-sync" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727401 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" containerName="glance-db-sync" Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727410 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78811b9d-d814-4266-911b-9c466a6df5e4" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727416 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="78811b9d-d814-4266-911b-9c466a6df5e4" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727426 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3424ee2-48c0-4904-a435-0acde28a6043" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727432 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3424ee2-48c0-4904-a435-0acde28a6043" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: E0227 06:31:12.727441 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6df5742-cb9e-4731-aa27-0efe73e9e61a" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727447 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6df5742-cb9e-4731-aa27-0efe73e9e61a" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727591 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6df5742-cb9e-4731-aa27-0efe73e9e61a" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727609 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="94647b2a-f9bf-498e-bb33-ca18ee334284" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727616 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3424ee2-48c0-4904-a435-0acde28a6043" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727625 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe3bf48-3f87-4630-b879-65a1614acb41" 
containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727632 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd9edff-02fe-4305-81b9-ee8fbea78f20" containerName="mariadb-database-create" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727645 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="78811b9d-d814-4266-911b-9c466a6df5e4" containerName="mariadb-account-create-update" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.727653 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" containerName="glance-db-sync" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.728514 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.733150 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-nb\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.733302 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-dns-svc\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.733337 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcnl\" (UniqueName: \"kubernetes.io/projected/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-kube-api-access-gqcnl\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: 
\"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.733354 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-config\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.733514 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-sb\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.780347 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d57f9cf89-kzkgz"] Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.834567 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-nb\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.834625 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-dns-svc\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.834669 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcnl\" (UniqueName: 
\"kubernetes.io/projected/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-kube-api-access-gqcnl\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.834689 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-config\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.834748 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-sb\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.836504 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-nb\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.836613 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-config\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.838374 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.838464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-dns-svc\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:12 crc kubenswrapper[4725]: I0227 06:31:12.856156 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcnl\" (UniqueName: \"kubernetes.io/projected/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-kube-api-access-gqcnl\") pod \"dnsmasq-dns-5d57f9cf89-kzkgz\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:13 crc kubenswrapper[4725]: I0227 06:31:13.069038 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:13 crc kubenswrapper[4725]: I0227 06:31:13.541437 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 06:31:13 crc kubenswrapper[4725]: I0227 06:31:13.600068 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d57f9cf89-kzkgz"] Feb 27 06:31:13 crc kubenswrapper[4725]: W0227 06:31:13.600871 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bc3d70b_4e1a_4ceb_8e33_f59d2d6d238b.slice/crio-54d798aa6577cf86494e00d52af080adf5ec7ea8057e56c0c3771f008fb62e0a WatchSource:0}: Error finding container 54d798aa6577cf86494e00d52af080adf5ec7ea8057e56c0c3771f008fb62e0a: Status 404 returned error can't find the container with id 54d798aa6577cf86494e00d52af080adf5ec7ea8057e56c0c3771f008fb62e0a Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.262907 4725 generic.go:334] "Generic (PLEG): container finished" podID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerID="80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6" exitCode=0 Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.266060 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" event={"ID":"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b","Type":"ContainerDied","Data":"80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6"} Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.266735 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" event={"ID":"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b","Type":"ContainerStarted","Data":"54d798aa6577cf86494e00d52af080adf5ec7ea8057e56c0c3771f008fb62e0a"} Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.289683 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vvwlj" 
event={"ID":"df17b144-75a9-44a8-a2b4-4694687dc01f","Type":"ContainerStarted","Data":"0eb7dcd26fc725bba4ccc650386b5e042f0ad7517425afbe2b83666767456303"} Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.297197 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"61c1321690f35838b8c095d03ee66e656632dab55ce4006683dcf1251189736d"} Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.300919 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-c7txp" event={"ID":"be03cf3c-5ffa-40cd-9a69-cb386068bc2c","Type":"ContainerStarted","Data":"fa80896acee4d9c5698973586823ba4b9b7144dc91d808d9a17bad162b28c1d0"} Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.322515 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vvwlj" podStartSLOduration=2.325820157 podStartE2EDuration="13.322496938s" podCreationTimestamp="2026-02-27 06:31:01 +0000 UTC" firstStartedPulling="2026-02-27 06:31:02.053589377 +0000 UTC m=+1240.516209946" lastFinishedPulling="2026-02-27 06:31:13.050266148 +0000 UTC m=+1251.512886727" observedRunningTime="2026-02-27 06:31:14.317716264 +0000 UTC m=+1252.780336833" watchObservedRunningTime="2026-02-27 06:31:14.322496938 +0000 UTC m=+1252.785117507" Feb 27 06:31:14 crc kubenswrapper[4725]: I0227 06:31:14.338342 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-c7txp" podStartSLOduration=3.141676252 podStartE2EDuration="13.338322733s" podCreationTimestamp="2026-02-27 06:31:01 +0000 UTC" firstStartedPulling="2026-02-27 06:31:02.873102055 +0000 UTC m=+1241.335722624" lastFinishedPulling="2026-02-27 06:31:13.069748536 +0000 UTC m=+1251.532369105" observedRunningTime="2026-02-27 06:31:14.335683319 +0000 UTC m=+1252.798303898" watchObservedRunningTime="2026-02-27 06:31:14.338322733 +0000 UTC 
m=+1252.800943302" Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.310850 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" event={"ID":"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b","Type":"ContainerStarted","Data":"181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3"} Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.311144 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.315481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"d182531c141cc9298a40a83b050d342c883c36c97b1288066a485ccb8cbdcc2f"} Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.315527 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"26b7ec55a0150639893491fa26ec0e062a6bc9b46560668dba22cd36064d806c"} Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.315538 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"fca4a0a45d28c5bc367df13a11f541d35d2834cea92c3e87cc7f1d80cdcc08f6"} Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.315547 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"9bb7836be4dee467428ef16c794268d861250dd94f728fd3172cbe62c98ec8ae"} Feb 27 06:31:15 crc kubenswrapper[4725]: I0227 06:31:15.335530 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" podStartSLOduration=3.335511289 podStartE2EDuration="3.335511289s" podCreationTimestamp="2026-02-27 
06:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:15.329409807 +0000 UTC m=+1253.792030376" watchObservedRunningTime="2026-02-27 06:31:15.335511289 +0000 UTC m=+1253.798131848" Feb 27 06:31:16 crc kubenswrapper[4725]: I0227 06:31:16.336033 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"c7b1340286739e9aae9f1ed8aad059d2372c6eeb1360338d72013f050ba3028c"} Feb 27 06:31:16 crc kubenswrapper[4725]: I0227 06:31:16.336401 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"99370ea5d26e18d5c2ff15a3a8f78af108215449cdcfada46c01e04e123b0ab3"} Feb 27 06:31:16 crc kubenswrapper[4725]: I0227 06:31:16.336414 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"3f944b36f6c365f02d9a96aa9f09713628cb6572f45bbce3186366c2e7643b43"} Feb 27 06:31:17 crc kubenswrapper[4725]: I0227 06:31:17.348118 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"84d4a26eedf48ef94bd620ae91519e2bf22daf78358187deee82df2cef7e4b4f"} Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.368018 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"3819baefd353a9e25852c4adea4cb834a53af95ca6412c17261879c64abbae0a"} Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.368453 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"0b758f9e72eb737ca144967d55ce42aca72aadcbea766c1350f0ff57bb23b8a0"} Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.368472 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"e89ed9a35d0cf8ddf6134d0e4da5a780ec8b4fcbff1e82f724f6454fbbd31621"} Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.368485 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"366300b527a2752932fbf3528919a1448919993fec3f316daa7debbe9b603e35"} Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.368497 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"0f18bf91dcb0fb5328ad15f3734257b0d70d94befe94e49f9816ce607cb700cb"} Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.370233 4725 generic.go:334] "Generic (PLEG): container finished" podID="be03cf3c-5ffa-40cd-9a69-cb386068bc2c" containerID="fa80896acee4d9c5698973586823ba4b9b7144dc91d808d9a17bad162b28c1d0" exitCode=0 Feb 27 06:31:18 crc kubenswrapper[4725]: I0227 06:31:18.370278 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-c7txp" event={"ID":"be03cf3c-5ffa-40cd-9a69-cb386068bc2c","Type":"ContainerDied","Data":"fa80896acee4d9c5698973586823ba4b9b7144dc91d808d9a17bad162b28c1d0"} Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.385948 4725 generic.go:334] "Generic (PLEG): container finished" podID="df17b144-75a9-44a8-a2b4-4694687dc01f" containerID="0eb7dcd26fc725bba4ccc650386b5e042f0ad7517425afbe2b83666767456303" exitCode=0 Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.386178 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-vvwlj" event={"ID":"df17b144-75a9-44a8-a2b4-4694687dc01f","Type":"ContainerDied","Data":"0eb7dcd26fc725bba4ccc650386b5e042f0ad7517425afbe2b83666767456303"} Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.405637 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"ce141e51e6c65c4bdc324c31468a0902bc46b3e4647e12e52b90fdc454a2927f"} Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.405684 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"872eba69-b1d2-4028-b65f-b70fa14daeb0","Type":"ContainerStarted","Data":"5198797fc12f05d49d1c7adc0065ebc8372dd19beb0d7c89b85befda183cd36a"} Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.477842 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=69.568233294 podStartE2EDuration="1m13.477815277s" podCreationTimestamp="2026-02-27 06:30:06 +0000 UTC" firstStartedPulling="2026-02-27 06:31:13.554324914 +0000 UTC m=+1252.016945483" lastFinishedPulling="2026-02-27 06:31:17.463906887 +0000 UTC m=+1255.926527466" observedRunningTime="2026-02-27 06:31:19.467574249 +0000 UTC m=+1257.930194918" watchObservedRunningTime="2026-02-27 06:31:19.477815277 +0000 UTC m=+1257.940435876" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.791595 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d57f9cf89-kzkgz"] Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.792565 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerName="dnsmasq-dns" containerID="cri-o://181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3" gracePeriod=10 Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.793423 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.835402 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869c6c7487-hlmx7"] Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.836777 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.841993 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.849105 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869c6c7487-hlmx7"] Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.870265 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.900492 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8h5j\" (UniqueName: \"kubernetes.io/projected/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-kube-api-access-l8h5j\") pod \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.900639 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-db-sync-config-data\") pod \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.900677 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-combined-ca-bundle\") pod 
\"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.900742 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-config-data\") pod \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\" (UID: \"be03cf3c-5ffa-40cd-9a69-cb386068bc2c\") " Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.901095 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-swift-storage-0\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.901143 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-config\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.901190 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-svc\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.901253 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-sb\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: 
\"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.901307 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-nb\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.901394 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jddxk\" (UniqueName: \"kubernetes.io/projected/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-kube-api-access-jddxk\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.906978 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "be03cf3c-5ffa-40cd-9a69-cb386068bc2c" (UID: "be03cf3c-5ffa-40cd-9a69-cb386068bc2c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.909449 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-kube-api-access-l8h5j" (OuterVolumeSpecName: "kube-api-access-l8h5j") pod "be03cf3c-5ffa-40cd-9a69-cb386068bc2c" (UID: "be03cf3c-5ffa-40cd-9a69-cb386068bc2c"). InnerVolumeSpecName "kube-api-access-l8h5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.942914 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be03cf3c-5ffa-40cd-9a69-cb386068bc2c" (UID: "be03cf3c-5ffa-40cd-9a69-cb386068bc2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:19 crc kubenswrapper[4725]: I0227 06:31:19.960555 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-config-data" (OuterVolumeSpecName: "config-data") pod "be03cf3c-5ffa-40cd-9a69-cb386068bc2c" (UID: "be03cf3c-5ffa-40cd-9a69-cb386068bc2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002485 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jddxk\" (UniqueName: \"kubernetes.io/projected/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-kube-api-access-jddxk\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-swift-storage-0\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002601 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-config\") pod 
\"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002648 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-svc\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002708 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-sb\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002745 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-nb\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002823 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8h5j\" (UniqueName: \"kubernetes.io/projected/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-kube-api-access-l8h5j\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002839 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002851 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.002862 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be03cf3c-5ffa-40cd-9a69-cb386068bc2c-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.003962 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-swift-storage-0\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.004115 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-nb\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.004674 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-sb\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.004879 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-config\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.008657 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-svc\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.020409 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jddxk\" (UniqueName: \"kubernetes.io/projected/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-kube-api-access-jddxk\") pod \"dnsmasq-dns-869c6c7487-hlmx7\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.109633 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.281926 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.310727 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-dns-svc\") pod \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.310798 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcnl\" (UniqueName: \"kubernetes.io/projected/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-kube-api-access-gqcnl\") pod \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.310885 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-sb\") pod \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.311014 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-config\") pod \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.311195 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-nb\") pod \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\" (UID: \"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.319480 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-kube-api-access-gqcnl" (OuterVolumeSpecName: "kube-api-access-gqcnl") pod "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" (UID: "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b"). InnerVolumeSpecName "kube-api-access-gqcnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.359113 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-config" (OuterVolumeSpecName: "config") pod "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" (UID: "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.360826 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" (UID: "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.365451 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" (UID: "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.371380 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" (UID: "7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.414277 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.414349 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.414360 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.414368 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqcnl\" (UniqueName: \"kubernetes.io/projected/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-kube-api-access-gqcnl\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.414377 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.418979 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-c7txp" event={"ID":"be03cf3c-5ffa-40cd-9a69-cb386068bc2c","Type":"ContainerDied","Data":"6f3a7524a3feebd93ef76f37651849652d8e7ce27597c0474ef743bc255d7a14"} Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.419028 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3a7524a3feebd93ef76f37651849652d8e7ce27597c0474ef743bc255d7a14" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.419113 4725 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-c7txp" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.424388 4725 generic.go:334] "Generic (PLEG): container finished" podID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerID="181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3" exitCode=0 Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.424776 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" event={"ID":"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b","Type":"ContainerDied","Data":"181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3"} Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.424814 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" event={"ID":"7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b","Type":"ContainerDied","Data":"54d798aa6577cf86494e00d52af080adf5ec7ea8057e56c0c3771f008fb62e0a"} Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.424831 4725 scope.go:117] "RemoveContainer" containerID="181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.424992 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d57f9cf89-kzkgz" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.450047 4725 scope.go:117] "RemoveContainer" containerID="80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.481033 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d57f9cf89-kzkgz"] Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.490813 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d57f9cf89-kzkgz"] Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.512999 4725 scope.go:117] "RemoveContainer" containerID="181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3" Feb 27 06:31:20 crc kubenswrapper[4725]: E0227 06:31:20.513531 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3\": container with ID starting with 181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3 not found: ID does not exist" containerID="181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.513561 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3"} err="failed to get container status \"181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3\": rpc error: code = NotFound desc = could not find container \"181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3\": container with ID starting with 181602e9a0ab70d74e3b7e560115e5b9faeebff49555c1b11ee80014f41b32a3 not found: ID does not exist" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.513582 4725 scope.go:117] "RemoveContainer" containerID="80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6" Feb 27 
06:31:20 crc kubenswrapper[4725]: E0227 06:31:20.514818 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6\": container with ID starting with 80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6 not found: ID does not exist" containerID="80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.514865 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6"} err="failed to get container status \"80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6\": rpc error: code = NotFound desc = could not find container \"80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6\": container with ID starting with 80cedca369a4a6649a90cca9a4c86e9a65baf0d5b9db2eee858f1b1e213a87c6 not found: ID does not exist" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.601050 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869c6c7487-hlmx7"] Feb 27 06:31:20 crc kubenswrapper[4725]: W0227 06:31:20.617243 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c94c9f_bdfd_4602_830c_cbf52e3c3e5b.slice/crio-40d622a30a59b18606ea08cc97d7b1c4605528972df53b5d66d57bf9fd381e24 WatchSource:0}: Error finding container 40d622a30a59b18606ea08cc97d7b1c4605528972df53b5d66d57bf9fd381e24: Status 404 returned error can't find the container with id 40d622a30a59b18606ea08cc97d7b1c4605528972df53b5d66d57bf9fd381e24 Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.718694 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.819166 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-combined-ca-bundle\") pod \"df17b144-75a9-44a8-a2b4-4694687dc01f\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.819235 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzdtt\" (UniqueName: \"kubernetes.io/projected/df17b144-75a9-44a8-a2b4-4694687dc01f-kube-api-access-fzdtt\") pod \"df17b144-75a9-44a8-a2b4-4694687dc01f\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.819608 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-config-data\") pod \"df17b144-75a9-44a8-a2b4-4694687dc01f\" (UID: \"df17b144-75a9-44a8-a2b4-4694687dc01f\") " Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.823483 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df17b144-75a9-44a8-a2b4-4694687dc01f-kube-api-access-fzdtt" (OuterVolumeSpecName: "kube-api-access-fzdtt") pod "df17b144-75a9-44a8-a2b4-4694687dc01f" (UID: "df17b144-75a9-44a8-a2b4-4694687dc01f"). InnerVolumeSpecName "kube-api-access-fzdtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.858526 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df17b144-75a9-44a8-a2b4-4694687dc01f" (UID: "df17b144-75a9-44a8-a2b4-4694687dc01f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.887051 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-config-data" (OuterVolumeSpecName: "config-data") pod "df17b144-75a9-44a8-a2b4-4694687dc01f" (UID: "df17b144-75a9-44a8-a2b4-4694687dc01f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.922502 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.922530 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzdtt\" (UniqueName: \"kubernetes.io/projected/df17b144-75a9-44a8-a2b4-4694687dc01f-kube-api-access-fzdtt\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:20 crc kubenswrapper[4725]: I0227 06:31:20.922541 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df17b144-75a9-44a8-a2b4-4694687dc01f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:21 crc kubenswrapper[4725]: E0227 06:31:21.128234 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c94c9f_bdfd_4602_830c_cbf52e3c3e5b.slice/crio-conmon-707b34b0f9916f77e2485da90a5457025742c8f6fd82f61ca2f41f4798863a84.scope\": RecentStats: unable to find data in memory cache]" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.440835 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vvwlj" 
event={"ID":"df17b144-75a9-44a8-a2b4-4694687dc01f","Type":"ContainerDied","Data":"db1e5f7c7578a0158f307ac480f2ad96887cbe37373fe12f0297813d754c1a67"} Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.440903 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db1e5f7c7578a0158f307ac480f2ad96887cbe37373fe12f0297813d754c1a67" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.443367 4725 generic.go:334] "Generic (PLEG): container finished" podID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerID="707b34b0f9916f77e2485da90a5457025742c8f6fd82f61ca2f41f4798863a84" exitCode=0 Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.443409 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" event={"ID":"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b","Type":"ContainerDied","Data":"707b34b0f9916f77e2485da90a5457025742c8f6fd82f61ca2f41f4798863a84"} Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.443441 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" event={"ID":"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b","Type":"ContainerStarted","Data":"40d622a30a59b18606ea08cc97d7b1c4605528972df53b5d66d57bf9fd381e24"} Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.444260 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vvwlj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.636198 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869c6c7487-hlmx7"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.673745 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b657dbc79-2cdjj"] Feb 27 06:31:21 crc kubenswrapper[4725]: E0227 06:31:21.674195 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17b144-75a9-44a8-a2b4-4694687dc01f" containerName="keystone-db-sync" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.674218 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17b144-75a9-44a8-a2b4-4694687dc01f" containerName="keystone-db-sync" Feb 27 06:31:21 crc kubenswrapper[4725]: E0227 06:31:21.674230 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerName="dnsmasq-dns" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.674239 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerName="dnsmasq-dns" Feb 27 06:31:21 crc kubenswrapper[4725]: E0227 06:31:21.674256 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerName="init" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.674265 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerName="init" Feb 27 06:31:21 crc kubenswrapper[4725]: E0227 06:31:21.674309 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be03cf3c-5ffa-40cd-9a69-cb386068bc2c" containerName="watcher-db-sync" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.674319 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="be03cf3c-5ffa-40cd-9a69-cb386068bc2c" containerName="watcher-db-sync" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 
06:31:21.674512 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="be03cf3c-5ffa-40cd-9a69-cb386068bc2c" containerName="watcher-db-sync" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.674560 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" containerName="dnsmasq-dns" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.674591 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="df17b144-75a9-44a8-a2b4-4694687dc01f" containerName="keystone-db-sync" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.675700 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.704332 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b657dbc79-2cdjj"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.713405 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sjwvj"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.714518 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.720028 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4dkt" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.720418 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.720651 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.720730 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.723951 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.724609 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sjwvj"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.735772 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-config\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.736184 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-config-data\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.736265 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.736721 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-credential-keys\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.736828 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xf5\" (UniqueName: \"kubernetes.io/projected/6f4a4c70-f6ec-45e4-a981-ee17af923a95-kube-api-access-s8xf5\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.736911 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.736978 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5wsw\" (UniqueName: \"kubernetes.io/projected/8bb9b77c-4b6c-48be-9466-1885010f9ff9-kube-api-access-g5wsw\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.737046 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-scripts\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.737323 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-svc\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.737411 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-fernet-keys\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.737474 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-combined-ca-bundle\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.737527 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839182 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-config\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-config-data\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839277 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839332 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-credential-keys\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839390 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xf5\" (UniqueName: \"kubernetes.io/projected/6f4a4c70-f6ec-45e4-a981-ee17af923a95-kube-api-access-s8xf5\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839439 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839465 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5wsw\" (UniqueName: \"kubernetes.io/projected/8bb9b77c-4b6c-48be-9466-1885010f9ff9-kube-api-access-g5wsw\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839499 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-scripts\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839556 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-svc\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839589 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-fernet-keys\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839617 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-combined-ca-bundle\") pod 
\"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.839647 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.840724 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.841364 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-config\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.843541 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-svc\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.845157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " 
pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.845269 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.845894 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-credential-keys\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.851920 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.872838 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.889818 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.893072 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.893360 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m7t9c" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.909374 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-combined-ca-bundle\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.912078 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-config-data\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.926004 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xf5\" (UniqueName: \"kubernetes.io/projected/6f4a4c70-f6ec-45e4-a981-ee17af923a95-kube-api-access-s8xf5\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.930005 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-fernet-keys\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " 
pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.930121 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-scripts\") pod \"keystone-bootstrap-sjwvj\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.940281 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.942158 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.945972 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.946046 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq2k5\" (UniqueName: \"kubernetes.io/projected/012808cf-2cf1-4882-b22f-28218fd1a6f8-kube-api-access-dq2k5\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.946117 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/012808cf-2cf1-4882-b22f-28218fd1a6f8-logs\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.946172 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-config-data\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.946219 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.967044 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.967077 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5wsw\" (UniqueName: \"kubernetes.io/projected/8bb9b77c-4b6c-48be-9466-1885010f9ff9-kube-api-access-g5wsw\") pod \"dnsmasq-dns-6b657dbc79-2cdjj\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:21 crc kubenswrapper[4725]: I0227 06:31:21.985089 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.007174 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cc5ddffd5-r9bpn"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.008981 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.014790 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.015201 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.015587 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tfwst" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.015750 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.025683 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.048304 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.048358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.048393 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq2k5\" (UniqueName: \"kubernetes.io/projected/012808cf-2cf1-4882-b22f-28218fd1a6f8-kube-api-access-dq2k5\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " 
pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.048447 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/012808cf-2cf1-4882-b22f-28218fd1a6f8-logs\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.048492 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-config-data\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.051105 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.052357 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.053324 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/012808cf-2cf1-4882-b22f-28218fd1a6f8-logs\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.056007 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.057302 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.070073 4725 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.087999 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bvhmj"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.089249 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.091930 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.107828 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mpctd" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.108051 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.108214 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.115633 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq2k5\" (UniqueName: \"kubernetes.io/projected/012808cf-2cf1-4882-b22f-28218fd1a6f8-kube-api-access-dq2k5\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.116427 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-config-data\") pod \"watcher-api-0\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " 
pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.134381 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.153870 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bvhmj"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.160572 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cc5ddffd5-r9bpn"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.157158 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-logs\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.160745 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.160822 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4dc45b9-7d75-443c-8712-44dca955b02d-logs\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.160950 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25zn\" (UniqueName: \"kubernetes.io/projected/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-kube-api-access-q25zn\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: 
\"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161050 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-scripts\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161151 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161226 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pbt8\" (UniqueName: \"kubernetes.io/projected/98e9aa25-5670-466b-92a2-26b711b3ccf4-kube-api-access-9pbt8\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161356 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161421 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9vq\" (UniqueName: \"kubernetes.io/projected/a4dc45b9-7d75-443c-8712-44dca955b02d-kube-api-access-sm9vq\") pod 
\"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161523 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-horizon-secret-key\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161595 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-config-data\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161709 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161783 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98e9aa25-5670-466b-92a2-26b711b3ccf4-logs\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.161880 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-config-data\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: 
\"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.200523 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fxbvl"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.201577 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.211399 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.211716 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.211925 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-n8hxg" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.215702 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-5v8pq"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.216885 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.221066 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5kxt8" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.221300 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.221406 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272418 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-config\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272478 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98e9aa25-5670-466b-92a2-26b711b3ccf4-logs\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272545 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7kds\" (UniqueName: \"kubernetes.io/projected/ab103cbe-a833-4f47-8101-d9ea92afe59c-kube-api-access-v7kds\") pod \"neutron-db-sync-bvhmj\" (UID: 
\"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272570 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-config-data\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272699 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-logs\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272715 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272733 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4dc45b9-7d75-443c-8712-44dca955b02d-logs\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272788 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q25zn\" (UniqueName: \"kubernetes.io/projected/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-kube-api-access-q25zn\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272827 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-scripts\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272863 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272895 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pbt8\" (UniqueName: \"kubernetes.io/projected/98e9aa25-5670-466b-92a2-26b711b3ccf4-kube-api-access-9pbt8\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272913 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-combined-ca-bundle\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272942 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272956 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-sm9vq\" (UniqueName: \"kubernetes.io/projected/a4dc45b9-7d75-443c-8712-44dca955b02d-kube-api-access-sm9vq\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.272999 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-horizon-secret-key\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.273015 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-config-data\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.273447 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4dc45b9-7d75-443c-8712-44dca955b02d-logs\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.274062 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98e9aa25-5670-466b-92a2-26b711b3ccf4-logs\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.274339 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-logs\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: 
\"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.278577 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.282980 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.287420 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.288822 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.288991 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.289105 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.289258 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.289382 4725 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.296028 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-config-data\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.296243 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-scripts\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.299198 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-config-data\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.310212 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b" path="/var/lib/kubelet/pods/7bc3d70b-4e1a-4ceb-8e33-f59d2d6d238b/volumes" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.310791 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5v8pq"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.311535 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.312017 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25zn\" (UniqueName: \"kubernetes.io/projected/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-kube-api-access-q25zn\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.313673 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-horizon-secret-key\") pod \"horizon-7cc5ddffd5-r9bpn\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.323470 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fxbvl"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.327373 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9vq\" (UniqueName: \"kubernetes.io/projected/a4dc45b9-7d75-443c-8712-44dca955b02d-kube-api-access-sm9vq\") pod \"watcher-applier-0\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.332135 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pbt8\" (UniqueName: \"kubernetes.io/projected/98e9aa25-5670-466b-92a2-26b711b3ccf4-kube-api-access-9pbt8\") pod \"watcher-decision-engine-0\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.332177 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m7t9c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.338486 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.363964 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b95598a-2902-4372-b9f4-a40152f1c45f-logs\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375224 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a306be2-547c-404f-afc1-4f4639cf7a28-etc-machine-id\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375249 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-combined-ca-bundle\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-combined-ca-bundle\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-combined-ca-bundle\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375348 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-db-sync-config-data\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375368 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmj4c\" (UniqueName: \"kubernetes.io/projected/6a306be2-547c-404f-afc1-4f4639cf7a28-kube-api-access-bmj4c\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375390 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-scripts\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375414 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-config\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375443 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7kds\" (UniqueName: 
\"kubernetes.io/projected/ab103cbe-a833-4f47-8101-d9ea92afe59c-kube-api-access-v7kds\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375468 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86lsq\" (UniqueName: \"kubernetes.io/projected/4b95598a-2902-4372-b9f4-a40152f1c45f-kube-api-access-86lsq\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375505 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-scripts\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375534 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-config-data\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.375553 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-config-data\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.376751 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.382329 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-594f5b48d5-4gn6c"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.383759 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.387942 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.394112 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-combined-ca-bundle\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.398359 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b657dbc79-2cdjj"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.412161 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.412344 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.412450 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.412782 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594f5b48d5-4gn6c"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.417403 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-config\") pod 
\"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.428917 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.476049 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tfwst" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.477991 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489372 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-config-data\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489432 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-log-httpd\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489482 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-config-data\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489542 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-scripts\") pod 
\"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489593 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-config-data\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489649 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-config-data\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489693 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b95598a-2902-4372-b9f4-a40152f1c45f-logs\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489713 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a306be2-547c-404f-afc1-4f4639cf7a28-etc-machine-id\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489746 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-combined-ca-bundle\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc 
kubenswrapper[4725]: I0227 06:31:22.489766 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-run-httpd\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489839 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-combined-ca-bundle\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489921 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-scripts\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.489953 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490033 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-scripts\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490102 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-db-sync-config-data\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490174 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmj4c\" (UniqueName: \"kubernetes.io/projected/6a306be2-547c-404f-afc1-4f4639cf7a28-kube-api-access-bmj4c\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490198 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xkx\" (UniqueName: \"kubernetes.io/projected/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-kube-api-access-p8xkx\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490414 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-logs\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490483 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-scripts\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490544 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-horizon-secret-key\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490591 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490649 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp94q\" (UniqueName: \"kubernetes.io/projected/f817188c-5563-4b93-abe7-94305a5c95a9-kube-api-access-bp94q\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.490760 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86lsq\" (UniqueName: \"kubernetes.io/projected/4b95598a-2902-4372-b9f4-a40152f1c45f-kube-api-access-86lsq\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.491196 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.494261 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b95598a-2902-4372-b9f4-a40152f1c45f-logs\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.496191 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a306be2-547c-404f-afc1-4f4639cf7a28-etc-machine-id\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.518249 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-db-sync-config-data\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.530032 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-config-data\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.536628 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-scripts\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.540519 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-scripts\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.541356 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7kds\" (UniqueName: \"kubernetes.io/projected/ab103cbe-a833-4f47-8101-d9ea92afe59c-kube-api-access-v7kds\") pod \"neutron-db-sync-bvhmj\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.545039 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-combined-ca-bundle\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.552586 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-config-data\") pod \"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.559380 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86lsq\" (UniqueName: \"kubernetes.io/projected/4b95598a-2902-4372-b9f4-a40152f1c45f-kube-api-access-86lsq\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.609616 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmj4c\" (UniqueName: \"kubernetes.io/projected/6a306be2-547c-404f-afc1-4f4639cf7a28-kube-api-access-bmj4c\") pod 
\"cinder-db-sync-5v8pq\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.629588 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-scripts\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.630148 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.630216 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-scripts\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.630735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xkx\" (UniqueName: \"kubernetes.io/projected/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-kube-api-access-p8xkx\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.630779 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-logs\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.630826 
4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-horizon-secret-key\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.630842 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632583 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp94q\" (UniqueName: \"kubernetes.io/projected/f817188c-5563-4b93-abe7-94305a5c95a9-kube-api-access-bp94q\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632716 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-config-data\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632754 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-log-httpd\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632779 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-config-data\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632872 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-run-httpd\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632964 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-scripts\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.632423 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-logs\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.643119 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7767444847-ftb89"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.643173 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-combined-ca-bundle\") pod \"placement-db-sync-fxbvl\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") " pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.644803 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.645848 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.647147 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-scripts\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.647369 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.649240 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-horizon-secret-key\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.654267 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-log-httpd\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.655158 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-run-httpd\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.657747 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-config-data\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.665334 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-config-data\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.667210 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xkx\" (UniqueName: \"kubernetes.io/projected/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-kube-api-access-p8xkx\") pod \"horizon-594f5b48d5-4gn6c\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.674917 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" event={"ID":"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b","Type":"ContainerStarted","Data":"922a4a3b25b20bcc222240c5c0201297652b4de05a6d950d27c89aa8b0e79cda"} Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.675086 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerName="dnsmasq-dns" containerID="cri-o://922a4a3b25b20bcc222240c5c0201297652b4de05a6d950d27c89aa8b0e79cda" gracePeriod=10 Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 
06:31:22.675373 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.714357 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7767444847-ftb89"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.715268 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp94q\" (UniqueName: \"kubernetes.io/projected/f817188c-5563-4b93-abe7-94305a5c95a9-kube-api-access-bp94q\") pod \"ceilometer-0\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.727718 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zb7cc"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.729587 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.730836 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.732073 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-98jzd" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.734708 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.744893 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.748961 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.750146 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zb7cc"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.753609 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mg9ls" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.754129 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.757216 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.757381 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.775607 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.792852 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mpctd" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.796224 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.797044 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.822993 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fxbvl" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.845777 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" podStartSLOduration=3.845756127 podStartE2EDuration="3.845756127s" podCreationTimestamp="2026-02-27 06:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:22.714882876 +0000 UTC m=+1261.177503455" watchObservedRunningTime="2026-02-27 06:31:22.845756127 +0000 UTC m=+1261.308376696" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855643 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855691 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bfr\" (UniqueName: \"kubernetes.io/projected/2fb89808-3a91-47ca-8d7a-c96f22784048-kube-api-access-g9bfr\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855742 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-combined-ca-bundle\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855776 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gb22r\" (UniqueName: \"kubernetes.io/projected/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-kube-api-access-gb22r\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855806 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855827 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl8fj\" (UniqueName: \"kubernetes.io/projected/617c62fd-dee8-4bab-a69a-8f348c8487a3-kube-api-access-wl8fj\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855952 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-config\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.855988 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856033 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856055 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-db-sync-config-data\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856087 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-svc\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856125 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-swift-storage-0\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856146 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-logs\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856165 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-sb\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856229 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-nb\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.856259 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.871047 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.962983 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963033 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-db-sync-config-data\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963069 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-svc\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963112 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-swift-storage-0\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963140 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-logs\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc 
kubenswrapper[4725]: I0227 06:31:22.963161 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-sb\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963210 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963235 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-nb\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.963270 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.964275 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-sb\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.970155 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-swift-storage-0\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.970726 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-logs\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.971368 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-svc\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.972102 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-nb\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.980529 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.980813 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.980849 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bfr\" (UniqueName: \"kubernetes.io/projected/2fb89808-3a91-47ca-8d7a-c96f22784048-kube-api-access-g9bfr\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.980924 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-combined-ca-bundle\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.980977 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb22r\" (UniqueName: \"kubernetes.io/projected/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-kube-api-access-gb22r\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.980988 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.981014 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.981051 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl8fj\" (UniqueName: \"kubernetes.io/projected/617c62fd-dee8-4bab-a69a-8f348c8487a3-kube-api-access-wl8fj\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.981070 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-config\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.981100 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.981613 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.986277 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-db-sync-config-data\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.990054 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-config\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:22 crc kubenswrapper[4725]: I0227 06:31:22.990594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.000994 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.015053 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl8fj\" (UniqueName: \"kubernetes.io/projected/617c62fd-dee8-4bab-a69a-8f348c8487a3-kube-api-access-wl8fj\") pod \"barbican-db-sync-zb7cc\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.024541 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-combined-ca-bundle\") pod \"barbican-db-sync-zb7cc\" (UID: 
\"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.024857 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b657dbc79-2cdjj"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.026479 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.026944 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb22r\" (UniqueName: \"kubernetes.io/projected/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-kube-api-access-gb22r\") pod \"dnsmasq-dns-7767444847-ftb89\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.035707 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.036947 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bfr\" (UniqueName: \"kubernetes.io/projected/2fb89808-3a91-47ca-8d7a-c96f22784048-kube-api-access-g9bfr\") pod \"glance-default-internal-api-0\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.051755 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.065748 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.067319 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.071073 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.075360 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.085964 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.155205 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sjwvj"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186270 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-logs\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186698 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-config-data\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186782 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czbvr\" (UniqueName: \"kubernetes.io/projected/154e40c2-11b3-4eec-93c1-6dc57202ac90-kube-api-access-czbvr\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186844 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186877 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-scripts\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.186920 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 
06:31:23.187001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: W0227 06:31:23.227778 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4a4c70_f6ec_45e4_a981_ee17af923a95.slice/crio-8fa7a92e5ade6e1df1fd6678fabbb8b2a9de2bc73d9511f19054ec6af31cbdca WatchSource:0}: Error finding container 8fa7a92e5ade6e1df1fd6678fabbb8b2a9de2bc73d9511f19054ec6af31cbdca: Status 404 returned error can't find the container with id 8fa7a92e5ade6e1df1fd6678fabbb8b2a9de2bc73d9511f19054ec6af31cbdca Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.245434 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.249319 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289171 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289238 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289261 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-logs\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289299 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-config-data\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czbvr\" (UniqueName: \"kubernetes.io/projected/154e40c2-11b3-4eec-93c1-6dc57202ac90-kube-api-access-czbvr\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 
06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289399 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289424 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-scripts\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289446 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.289848 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.296922 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.297846 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.298393 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-config-data\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.299002 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.301654 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-logs\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.304500 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.304659 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.310382 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-scripts\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.313438 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czbvr\" (UniqueName: \"kubernetes.io/projected/154e40c2-11b3-4eec-93c1-6dc57202ac90-kube-api-access-czbvr\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.370828 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.576751 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.631524 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:31:23 crc kubenswrapper[4725]: W0227 06:31:23.636792 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4dc45b9_7d75_443c_8712_44dca955b02d.slice/crio-a6ba534cc3da1867c6f02806239f2a692feed875bc3e0e8126850f5e389e100c WatchSource:0}: Error finding container a6ba534cc3da1867c6f02806239f2a692feed875bc3e0e8126850f5e389e100c: Status 404 returned error can't find the container with id a6ba534cc3da1867c6f02806239f2a692feed875bc3e0e8126850f5e389e100c Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.638911 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:31:23 crc kubenswrapper[4725]: W0227 06:31:23.655192 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf817188c_5563_4b93_abe7_94305a5c95a9.slice/crio-5e912678bc8e972388309a7a24d0f78262f9d001cfcdb4ac35954685a13c3fd3 WatchSource:0}: Error finding container 5e912678bc8e972388309a7a24d0f78262f9d001cfcdb4ac35954685a13c3fd3: Status 404 returned error can't find the container with id 5e912678bc8e972388309a7a24d0f78262f9d001cfcdb4ac35954685a13c3fd3 Feb 27 06:31:23 crc kubenswrapper[4725]: W0227 06:31:23.655484 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e9aa25_5670_466b_92a2_26b711b3ccf4.slice/crio-5e03a47d41878c3e27e7745a3f685363d47fc205a380f355a45e39364eb32527 WatchSource:0}: Error finding container 5e03a47d41878c3e27e7745a3f685363d47fc205a380f355a45e39364eb32527: Status 404 returned error can't find the container with id 5e03a47d41878c3e27e7745a3f685363d47fc205a380f355a45e39364eb32527 Feb 
27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.666155 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cc5ddffd5-r9bpn"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.677183 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.686924 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"012808cf-2cf1-4882-b22f-28218fd1a6f8","Type":"ContainerStarted","Data":"3203fc4970aa8badbe3656c13be5cd370d0b539d500047f643e4257cba0c7706"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.686959 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"012808cf-2cf1-4882-b22f-28218fd1a6f8","Type":"ContainerStarted","Data":"62b9e37d02dc57234dc8ad97a6f06d73103b53c3cb42698f498186aced9a469d"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.688187 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sjwvj" event={"ID":"6f4a4c70-f6ec-45e4-a981-ee17af923a95","Type":"ContainerStarted","Data":"af9a5ea31ff62326be7796d50e91aa79a890cd4050b974077a59b842f558133b"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.688206 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sjwvj" event={"ID":"6f4a4c70-f6ec-45e4-a981-ee17af923a95","Type":"ContainerStarted","Data":"8fa7a92e5ade6e1df1fd6678fabbb8b2a9de2bc73d9511f19054ec6af31cbdca"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.709913 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a4dc45b9-7d75-443c-8712-44dca955b02d","Type":"ContainerStarted","Data":"a6ba534cc3da1867c6f02806239f2a692feed875bc3e0e8126850f5e389e100c"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.721556 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerID="922a4a3b25b20bcc222240c5c0201297652b4de05a6d950d27c89aa8b0e79cda" exitCode=0 Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.721649 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" event={"ID":"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b","Type":"ContainerDied","Data":"922a4a3b25b20bcc222240c5c0201297652b4de05a6d950d27c89aa8b0e79cda"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.724661 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc5ddffd5-r9bpn" event={"ID":"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd","Type":"ContainerStarted","Data":"737101e1fa91aa437b70cc13dd0baba7cecea4a1fb214a1c6b71527ed42d010f"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.727106 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerStarted","Data":"5e03a47d41878c3e27e7745a3f685363d47fc205a380f355a45e39364eb32527"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.728211 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerStarted","Data":"5e912678bc8e972388309a7a24d0f78262f9d001cfcdb4ac35954685a13c3fd3"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.730763 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" event={"ID":"8bb9b77c-4b6c-48be-9466-1885010f9ff9","Type":"ContainerStarted","Data":"4dea99dd601f4c42c989283f6c4ff5ebe654782d8fbe2ddb47d14cb4748bbeb6"} Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.840545 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594f5b48d5-4gn6c"] Feb 27 06:31:23 crc kubenswrapper[4725]: W0227 06:31:23.843378 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab103cbe_a833_4f47_8101_d9ea92afe59c.slice/crio-681154434b36c5f6e9d8d1749cb3ab467ae584702fc3afd560bdb9ca63648068 WatchSource:0}: Error finding container 681154434b36c5f6e9d8d1749cb3ab467ae584702fc3afd560bdb9ca63648068: Status 404 returned error can't find the container with id 681154434b36c5f6e9d8d1749cb3ab467ae584702fc3afd560bdb9ca63648068 Feb 27 06:31:23 crc kubenswrapper[4725]: I0227 06:31:23.854755 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bvhmj"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.338861 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.339353 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zb7cc"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.339490 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7767444847-ftb89"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.339569 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fxbvl"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.369408 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.423982 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc5ddffd5-r9bpn"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.459700 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c8b7f8fb9-vxlzg"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.468979 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.492227 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.502586 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5v8pq"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.551578 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c8b7f8fb9-vxlzg"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.573521 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.588710 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650250 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-config\") pod \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650312 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jddxk\" (UniqueName: \"kubernetes.io/projected/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-kube-api-access-jddxk\") pod \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650358 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-nb\") pod \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650425 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-swift-storage-0\") pod \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650482 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-sb\") pod \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650663 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-svc\") pod \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\" (UID: \"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b\") " Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650865 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-horizon-secret-key\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.650901 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7449x\" (UniqueName: \"kubernetes.io/projected/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-kube-api-access-7449x\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.652673 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-config-data\") pod \"horizon-7c8b7f8fb9-vxlzg\" 
(UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.653219 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-scripts\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.653484 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-logs\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.671608 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.678546 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-kube-api-access-jddxk" (OuterVolumeSpecName: "kube-api-access-jddxk") pod "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" (UID: "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b"). InnerVolumeSpecName "kube-api-access-jddxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.681403 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758337 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-horizon-secret-key\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758386 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7449x\" (UniqueName: \"kubernetes.io/projected/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-kube-api-access-7449x\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758450 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-config-data\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758494 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-scripts\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758549 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-logs\") pod 
\"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758600 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jddxk\" (UniqueName: \"kubernetes.io/projected/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-kube-api-access-jddxk\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.758917 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-logs\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.772724 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7767444847-ftb89" event={"ID":"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f","Type":"ContainerStarted","Data":"b1a5e717f6c8651f57baa790c087c49ff6e4361ad1c344f6c024b6c13b5f48dd"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.774055 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-config-data\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.774622 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-scripts\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.812378 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7449x\" (UniqueName: 
\"kubernetes.io/projected/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-kube-api-access-7449x\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.812636 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-horizon-secret-key\") pod \"horizon-7c8b7f8fb9-vxlzg\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.813445 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" (UID: "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.823370 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" (UID: "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.824395 4725 generic.go:334] "Generic (PLEG): container finished" podID="8bb9b77c-4b6c-48be-9466-1885010f9ff9" containerID="5df3056da8b5dd2aff452725f56523096ec90069d499c25196d3039f5d154d1d" exitCode=0 Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.824551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" event={"ID":"8bb9b77c-4b6c-48be-9466-1885010f9ff9","Type":"ContainerDied","Data":"5df3056da8b5dd2aff452725f56523096ec90069d499c25196d3039f5d154d1d"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.839312 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154e40c2-11b3-4eec-93c1-6dc57202ac90","Type":"ContainerStarted","Data":"26b156074375250cfdf07c4b50ec2e9864503952698288c704712cc21d710c00"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.866792 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" event={"ID":"b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b","Type":"ContainerDied","Data":"40d622a30a59b18606ea08cc97d7b1c4605528972df53b5d66d57bf9fd381e24"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.866844 4725 scope.go:117] "RemoveContainer" containerID="922a4a3b25b20bcc222240c5c0201297652b4de05a6d950d27c89aa8b0e79cda" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.866967 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869c6c7487-hlmx7" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.870137 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.870155 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.870955 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2fb89808-3a91-47ca-8d7a-c96f22784048","Type":"ContainerStarted","Data":"9e5ca77deee1b4d02b1d91c18a162b058d877d99dd69328d71237ed12379530a"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.873827 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvhmj" event={"ID":"ab103cbe-a833-4f47-8101-d9ea92afe59c","Type":"ContainerStarted","Data":"8d28ad203b2aeaee1320d261fb493db88dc11595d7154df0162195fba412ad0c"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.873853 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvhmj" event={"ID":"ab103cbe-a833-4f47-8101-d9ea92afe59c","Type":"ContainerStarted","Data":"681154434b36c5f6e9d8d1749cb3ab467ae584702fc3afd560bdb9ca63648068"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.875404 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5v8pq" event={"ID":"6a306be2-547c-404f-afc1-4f4639cf7a28","Type":"ContainerStarted","Data":"4d6cb75cb078aa6e8e144ac051b379e31055dfa2e4c2cfffe56d4f4962e34ac3"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.918093 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" (UID: "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.919048 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"012808cf-2cf1-4882-b22f-28218fd1a6f8","Type":"ContainerStarted","Data":"5580d5c716c4b2fda51f391bc39be540360a330a7b40429acc3e03a22714d877"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.919250 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api-log" containerID="cri-o://3203fc4970aa8badbe3656c13be5cd370d0b539d500047f643e4257cba0c7706" gracePeriod=30 Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.920064 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api" containerID="cri-o://5580d5c716c4b2fda51f391bc39be540360a330a7b40429acc3e03a22714d877" gracePeriod=30 Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.920279 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.923840 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f5b48d5-4gn6c" event={"ID":"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156","Type":"ContainerStarted","Data":"d9df58203eaf1c115de04cae850e3fbf818869900cfcf9ae877c2e8fd9e63c94"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.932035 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zb7cc" 
event={"ID":"617c62fd-dee8-4bab-a69a-8f348c8487a3","Type":"ContainerStarted","Data":"e0441b3c16d6938635ab6818defdf1ee284cbbcf64407ed5e334948d02c0264b"} Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.933673 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" (UID: "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.943590 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-config" (OuterVolumeSpecName: "config") pod "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" (UID: "b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.972543 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.972571 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.972585 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.983626 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bvhmj" podStartSLOduration=3.9836082619999997 
podStartE2EDuration="3.983608262s" podCreationTimestamp="2026-02-27 06:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:24.894737482 +0000 UTC m=+1263.357358051" watchObservedRunningTime="2026-02-27 06:31:24.983608262 +0000 UTC m=+1263.446228831" Feb 27 06:31:24 crc kubenswrapper[4725]: I0227 06:31:24.991516 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fxbvl" event={"ID":"4b95598a-2902-4372-b9f4-a40152f1c45f","Type":"ContainerStarted","Data":"4727e1c073ca2f5af69493d330726c640223c5e8fba93ec9fc13afb251235f57"} Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.023339 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.023317048 podStartE2EDuration="4.023317048s" podCreationTimestamp="2026-02-27 06:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:24.949958405 +0000 UTC m=+1263.412578974" watchObservedRunningTime="2026-02-27 06:31:25.023317048 +0000 UTC m=+1263.485937617" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.035345 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sjwvj" podStartSLOduration=4.035321346 podStartE2EDuration="4.035321346s" podCreationTimestamp="2026-02-27 06:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:25.015729695 +0000 UTC m=+1263.478350254" watchObservedRunningTime="2026-02-27 06:31:25.035321346 +0000 UTC m=+1263.497941915" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.079413 4725 scope.go:117] "RemoveContainer" containerID="707b34b0f9916f77e2485da90a5457025742c8f6fd82f61ca2f41f4798863a84" Feb 27 06:31:25 crc 
kubenswrapper[4725]: I0227 06:31:25.107738 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.229340 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": read tcp 10.217.0.2:46098->10.217.0.153:9322: read: connection reset by peer" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.251566 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869c6c7487-hlmx7"] Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.262341 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869c6c7487-hlmx7"] Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.336246 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.485632 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-nb\") pod \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.485747 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5wsw\" (UniqueName: \"kubernetes.io/projected/8bb9b77c-4b6c-48be-9466-1885010f9ff9-kube-api-access-g5wsw\") pod \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.485770 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-swift-storage-0\") pod \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.485817 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-config\") pod \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.485844 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-svc\") pod \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.485860 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-sb\") pod \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\" (UID: \"8bb9b77c-4b6c-48be-9466-1885010f9ff9\") " Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.495696 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb9b77c-4b6c-48be-9466-1885010f9ff9-kube-api-access-g5wsw" (OuterVolumeSpecName: "kube-api-access-g5wsw") pod "8bb9b77c-4b6c-48be-9466-1885010f9ff9" (UID: "8bb9b77c-4b6c-48be-9466-1885010f9ff9"). InnerVolumeSpecName "kube-api-access-g5wsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.522010 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bb9b77c-4b6c-48be-9466-1885010f9ff9" (UID: "8bb9b77c-4b6c-48be-9466-1885010f9ff9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.523456 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bb9b77c-4b6c-48be-9466-1885010f9ff9" (UID: "8bb9b77c-4b6c-48be-9466-1885010f9ff9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.533114 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bb9b77c-4b6c-48be-9466-1885010f9ff9" (UID: "8bb9b77c-4b6c-48be-9466-1885010f9ff9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.537180 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-config" (OuterVolumeSpecName: "config") pod "8bb9b77c-4b6c-48be-9466-1885010f9ff9" (UID: "8bb9b77c-4b6c-48be-9466-1885010f9ff9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.554762 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bb9b77c-4b6c-48be-9466-1885010f9ff9" (UID: "8bb9b77c-4b6c-48be-9466-1885010f9ff9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.589339 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.589375 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5wsw\" (UniqueName: \"kubernetes.io/projected/8bb9b77c-4b6c-48be-9466-1885010f9ff9-kube-api-access-g5wsw\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.589387 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.589397 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.589406 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.589414 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8bb9b77c-4b6c-48be-9466-1885010f9ff9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:25 crc kubenswrapper[4725]: I0227 06:31:25.855359 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c8b7f8fb9-vxlzg"] Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.009098 4725 generic.go:334] "Generic (PLEG): container finished" podID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerID="6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200" exitCode=0 Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.009181 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7767444847-ftb89" event={"ID":"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f","Type":"ContainerDied","Data":"6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200"} Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.017380 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" event={"ID":"8bb9b77c-4b6c-48be-9466-1885010f9ff9","Type":"ContainerDied","Data":"4dea99dd601f4c42c989283f6c4ff5ebe654782d8fbe2ddb47d14cb4748bbeb6"} Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.017469 4725 scope.go:117] "RemoveContainer" containerID="5df3056da8b5dd2aff452725f56523096ec90069d499c25196d3039f5d154d1d" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.017602 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b657dbc79-2cdjj" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.032860 4725 generic.go:334] "Generic (PLEG): container finished" podID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerID="5580d5c716c4b2fda51f391bc39be540360a330a7b40429acc3e03a22714d877" exitCode=0 Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.032901 4725 generic.go:334] "Generic (PLEG): container finished" podID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerID="3203fc4970aa8badbe3656c13be5cd370d0b539d500047f643e4257cba0c7706" exitCode=143 Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.032950 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"012808cf-2cf1-4882-b22f-28218fd1a6f8","Type":"ContainerDied","Data":"5580d5c716c4b2fda51f391bc39be540360a330a7b40429acc3e03a22714d877"} Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.032980 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"012808cf-2cf1-4882-b22f-28218fd1a6f8","Type":"ContainerDied","Data":"3203fc4970aa8badbe3656c13be5cd370d0b539d500047f643e4257cba0c7706"} Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.036689 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154e40c2-11b3-4eec-93c1-6dc57202ac90","Type":"ContainerStarted","Data":"13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9"} Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.096004 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b657dbc79-2cdjj"] Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.104799 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b657dbc79-2cdjj"] Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.267989 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb9b77c-4b6c-48be-9466-1885010f9ff9" 
path="/var/lib/kubelet/pods/8bb9b77c-4b6c-48be-9466-1885010f9ff9/volumes" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.268801 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" path="/var/lib/kubelet/pods/b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b/volumes" Feb 27 06:31:26 crc kubenswrapper[4725]: W0227 06:31:26.548074 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e4ea208_f6a4_40f8_9b4e_14bfece1614d.slice/crio-494a854070209818598a5d8eba6a3e5c28924ce079637de55483da8606787647 WatchSource:0}: Error finding container 494a854070209818598a5d8eba6a3e5c28924ce079637de55483da8606787647: Status 404 returned error can't find the container with id 494a854070209818598a5d8eba6a3e5c28924ce079637de55483da8606787647 Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.711250 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.813859 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-combined-ca-bundle\") pod \"012808cf-2cf1-4882-b22f-28218fd1a6f8\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.814026 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/012808cf-2cf1-4882-b22f-28218fd1a6f8-logs\") pod \"012808cf-2cf1-4882-b22f-28218fd1a6f8\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.814087 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq2k5\" (UniqueName: 
\"kubernetes.io/projected/012808cf-2cf1-4882-b22f-28218fd1a6f8-kube-api-access-dq2k5\") pod \"012808cf-2cf1-4882-b22f-28218fd1a6f8\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.814103 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-custom-prometheus-ca\") pod \"012808cf-2cf1-4882-b22f-28218fd1a6f8\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.814338 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-config-data\") pod \"012808cf-2cf1-4882-b22f-28218fd1a6f8\" (UID: \"012808cf-2cf1-4882-b22f-28218fd1a6f8\") " Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.835552 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/012808cf-2cf1-4882-b22f-28218fd1a6f8-logs" (OuterVolumeSpecName: "logs") pod "012808cf-2cf1-4882-b22f-28218fd1a6f8" (UID: "012808cf-2cf1-4882-b22f-28218fd1a6f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.871583 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012808cf-2cf1-4882-b22f-28218fd1a6f8-kube-api-access-dq2k5" (OuterVolumeSpecName: "kube-api-access-dq2k5") pod "012808cf-2cf1-4882-b22f-28218fd1a6f8" (UID: "012808cf-2cf1-4882-b22f-28218fd1a6f8"). InnerVolumeSpecName "kube-api-access-dq2k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.883440 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "012808cf-2cf1-4882-b22f-28218fd1a6f8" (UID: "012808cf-2cf1-4882-b22f-28218fd1a6f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.915593 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "012808cf-2cf1-4882-b22f-28218fd1a6f8" (UID: "012808cf-2cf1-4882-b22f-28218fd1a6f8"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.916949 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq2k5\" (UniqueName: \"kubernetes.io/projected/012808cf-2cf1-4882-b22f-28218fd1a6f8-kube-api-access-dq2k5\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.916971 4725 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.916980 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.916988 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/012808cf-2cf1-4882-b22f-28218fd1a6f8-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:26 crc kubenswrapper[4725]: I0227 06:31:26.927331 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-config-data" (OuterVolumeSpecName: "config-data") pod "012808cf-2cf1-4882-b22f-28218fd1a6f8" (UID: "012808cf-2cf1-4882-b22f-28218fd1a6f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.018719 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/012808cf-2cf1-4882-b22f-28218fd1a6f8-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.046525 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"012808cf-2cf1-4882-b22f-28218fd1a6f8","Type":"ContainerDied","Data":"62b9e37d02dc57234dc8ad97a6f06d73103b53c3cb42698f498186aced9a469d"} Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.046568 4725 scope.go:117] "RemoveContainer" containerID="5580d5c716c4b2fda51f391bc39be540360a330a7b40429acc3e03a22714d877" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.046644 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.056495 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8b7f8fb9-vxlzg" event={"ID":"3e4ea208-f6a4-40f8-9b4e-14bfece1614d","Type":"ContainerStarted","Data":"494a854070209818598a5d8eba6a3e5c28924ce079637de55483da8606787647"} Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.059095 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2fb89808-3a91-47ca-8d7a-c96f22784048","Type":"ContainerStarted","Data":"8702692fe9211a5826a801eaf4c11c4ebc38030ec3e2ac5a14ff8405b4b6f679"} Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.086828 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.106903 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.134553 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:27 crc kubenswrapper[4725]: E0227 06:31:27.136003 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerName="init" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136045 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerName="init" Feb 27 06:31:27 crc kubenswrapper[4725]: E0227 06:31:27.136069 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerName="dnsmasq-dns" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136076 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerName="dnsmasq-dns" Feb 27 06:31:27 crc kubenswrapper[4725]: E0227 06:31:27.136104 4725 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8bb9b77c-4b6c-48be-9466-1885010f9ff9" containerName="init" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136110 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb9b77c-4b6c-48be-9466-1885010f9ff9" containerName="init" Feb 27 06:31:27 crc kubenswrapper[4725]: E0227 06:31:27.136130 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api-log" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136137 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api-log" Feb 27 06:31:27 crc kubenswrapper[4725]: E0227 06:31:27.136156 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136166 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136531 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb9b77c-4b6c-48be-9466-1885010f9ff9" containerName="init" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136551 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136598 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" containerName="watcher-api-log" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.136615 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c94c9f-bdfd-4602-830c-cbf52e3c3e5b" containerName="dnsmasq-dns" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.138359 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.169030 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.169451 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.226950 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9458\" (UniqueName: \"kubernetes.io/projected/852f1c61-8d84-42fd-bed8-be55f65b3a4c-kube-api-access-v9458\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.227056 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-config-data\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.227079 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852f1c61-8d84-42fd-bed8-be55f65b3a4c-logs\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.227168 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.227190 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.328626 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9458\" (UniqueName: \"kubernetes.io/projected/852f1c61-8d84-42fd-bed8-be55f65b3a4c-kube-api-access-v9458\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.328736 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-config-data\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.328765 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852f1c61-8d84-42fd-bed8-be55f65b3a4c-logs\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.328821 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.328843 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-combined-ca-bundle\") pod 
\"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.329476 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852f1c61-8d84-42fd-bed8-be55f65b3a4c-logs\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.332630 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-config-data\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.333979 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.334394 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.358089 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9458\" (UniqueName: \"kubernetes.io/projected/852f1c61-8d84-42fd-bed8-be55f65b3a4c-kube-api-access-v9458\") pod \"watcher-api-0\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.492752 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:31:27 crc kubenswrapper[4725]: I0227 06:31:27.861893 4725 scope.go:117] "RemoveContainer" containerID="3203fc4970aa8badbe3656c13be5cd370d0b539d500047f643e4257cba0c7706" Feb 27 06:31:28 crc kubenswrapper[4725]: I0227 06:31:28.267099 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012808cf-2cf1-4882-b22f-28218fd1a6f8" path="/var/lib/kubelet/pods/012808cf-2cf1-4882-b22f-28218fd1a6f8/volumes" Feb 27 06:31:29 crc kubenswrapper[4725]: I0227 06:31:29.095471 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7767444847-ftb89" event={"ID":"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f","Type":"ContainerStarted","Data":"e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd"} Feb 27 06:31:29 crc kubenswrapper[4725]: I0227 06:31:29.095975 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:29 crc kubenswrapper[4725]: I0227 06:31:29.118430 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7767444847-ftb89" podStartSLOduration=7.118410699 podStartE2EDuration="7.118410699s" podCreationTimestamp="2026-02-27 06:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:29.110310621 +0000 UTC m=+1267.572931210" watchObservedRunningTime="2026-02-27 06:31:29.118410699 +0000 UTC m=+1267.581031278" Feb 27 06:31:30 crc kubenswrapper[4725]: I0227 06:31:30.110909 4725 generic.go:334] "Generic (PLEG): container finished" podID="6f4a4c70-f6ec-45e4-a981-ee17af923a95" containerID="af9a5ea31ff62326be7796d50e91aa79a890cd4050b974077a59b842f558133b" exitCode=0 Feb 27 06:31:30 crc kubenswrapper[4725]: I0227 06:31:30.111022 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sjwvj" 
event={"ID":"6f4a4c70-f6ec-45e4-a981-ee17af923a95","Type":"ContainerDied","Data":"af9a5ea31ff62326be7796d50e91aa79a890cd4050b974077a59b842f558133b"} Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.132423 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2fb89808-3a91-47ca-8d7a-c96f22784048","Type":"ContainerStarted","Data":"659f36e0a32a2c589513eea645018110aebafdfd17701bdc5e62ce81025d429d"} Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.132610 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-log" containerID="cri-o://8702692fe9211a5826a801eaf4c11c4ebc38030ec3e2ac5a14ff8405b4b6f679" gracePeriod=30 Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.133196 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-httpd" containerID="cri-o://659f36e0a32a2c589513eea645018110aebafdfd17701bdc5e62ce81025d429d" gracePeriod=30 Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.163928 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.163903946 podStartE2EDuration="9.163903946s" podCreationTimestamp="2026-02-27 06:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:31.154115111 +0000 UTC m=+1269.616735700" watchObservedRunningTime="2026-02-27 06:31:31.163903946 +0000 UTC m=+1269.626524515" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.455204 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594f5b48d5-4gn6c"] Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.484606 4725 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56865cdb4-9hs85"] Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.487367 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.494312 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.509889 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56865cdb4-9hs85"] Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.544343 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c8b7f8fb9-vxlzg"] Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.576633 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f478fcd58-cfjzp"] Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.578939 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.605871 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f478fcd58-cfjzp"] Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612379 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4fc5fc3-880a-46c5-a0a1-3248884d9882-logs\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612457 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsmq\" (UniqueName: \"kubernetes.io/projected/a4fc5fc3-880a-46c5-a0a1-3248884d9882-kube-api-access-vqsmq\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612484 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-config-data\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612505 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-scripts\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612525 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-combined-ca-bundle\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612561 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-secret-key\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.612593 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-tls-certs\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.713982 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/372d4de4-ea8f-4393-af8b-1139e593ac16-scripts\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714032 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-tls-certs\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714194 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/372d4de4-ea8f-4393-af8b-1139e593ac16-logs\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714277 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4fc5fc3-880a-46c5-a0a1-3248884d9882-logs\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714446 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsmq\" (UniqueName: \"kubernetes.io/projected/a4fc5fc3-880a-46c5-a0a1-3248884d9882-kube-api-access-vqsmq\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714493 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-config-data\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714527 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-scripts\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-combined-ca-bundle\") pod \"horizon-56865cdb4-9hs85\" (UID: 
\"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714577 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-combined-ca-bundle\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714595 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkftn\" (UniqueName: \"kubernetes.io/projected/372d4de4-ea8f-4393-af8b-1139e593ac16-kube-api-access-xkftn\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714617 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-horizon-tls-certs\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714641 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/372d4de4-ea8f-4393-af8b-1139e593ac16-config-data\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-secret-key\") pod \"horizon-56865cdb4-9hs85\" (UID: 
\"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714762 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-horizon-secret-key\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.714758 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4fc5fc3-880a-46c5-a0a1-3248884d9882-logs\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.715270 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-scripts\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.716057 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-config-data\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.720274 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-secret-key\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 
06:31:31.721387 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-tls-certs\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.724652 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-combined-ca-bundle\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.737969 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsmq\" (UniqueName: \"kubernetes.io/projected/a4fc5fc3-880a-46c5-a0a1-3248884d9882-kube-api-access-vqsmq\") pod \"horizon-56865cdb4-9hs85\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816173 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-combined-ca-bundle\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816214 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkftn\" (UniqueName: \"kubernetes.io/projected/372d4de4-ea8f-4393-af8b-1139e593ac16-kube-api-access-xkftn\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816236 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-horizon-tls-certs\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816257 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/372d4de4-ea8f-4393-af8b-1139e593ac16-config-data\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816405 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-horizon-secret-key\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816427 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/372d4de4-ea8f-4393-af8b-1139e593ac16-scripts\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816488 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372d4de4-ea8f-4393-af8b-1139e593ac16-logs\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.816860 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372d4de4-ea8f-4393-af8b-1139e593ac16-logs\") pod 
\"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.817977 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/372d4de4-ea8f-4393-af8b-1139e593ac16-config-data\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.818385 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/372d4de4-ea8f-4393-af8b-1139e593ac16-scripts\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.820085 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-horizon-tls-certs\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.820881 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-combined-ca-bundle\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.821210 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/372d4de4-ea8f-4393-af8b-1139e593ac16-horizon-secret-key\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc 
kubenswrapper[4725]: I0227 06:31:31.836731 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.837793 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkftn\" (UniqueName: \"kubernetes.io/projected/372d4de4-ea8f-4393-af8b-1139e593ac16-kube-api-access-xkftn\") pod \"horizon-7f478fcd58-cfjzp\" (UID: \"372d4de4-ea8f-4393-af8b-1139e593ac16\") " pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:31 crc kubenswrapper[4725]: I0227 06:31:31.896358 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:31:32 crc kubenswrapper[4725]: I0227 06:31:32.164972 4725 generic.go:334] "Generic (PLEG): container finished" podID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerID="659f36e0a32a2c589513eea645018110aebafdfd17701bdc5e62ce81025d429d" exitCode=0 Feb 27 06:31:32 crc kubenswrapper[4725]: I0227 06:31:32.165003 4725 generic.go:334] "Generic (PLEG): container finished" podID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerID="8702692fe9211a5826a801eaf4c11c4ebc38030ec3e2ac5a14ff8405b4b6f679" exitCode=143 Feb 27 06:31:32 crc kubenswrapper[4725]: I0227 06:31:32.165023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2fb89808-3a91-47ca-8d7a-c96f22784048","Type":"ContainerDied","Data":"659f36e0a32a2c589513eea645018110aebafdfd17701bdc5e62ce81025d429d"} Feb 27 06:31:32 crc kubenswrapper[4725]: I0227 06:31:32.165046 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2fb89808-3a91-47ca-8d7a-c96f22784048","Type":"ContainerDied","Data":"8702692fe9211a5826a801eaf4c11c4ebc38030ec3e2ac5a14ff8405b4b6f679"} Feb 27 06:31:33 crc kubenswrapper[4725]: I0227 06:31:33.307494 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:31:33 crc kubenswrapper[4725]: I0227 06:31:33.370984 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444c9d757-pr69s"] Feb 27 06:31:33 crc kubenswrapper[4725]: I0227 06:31:33.371720 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" containerID="cri-o://c83f0cda61e0eeca3c659f093826b055b5486959ac3ce506b78bc7b5675e6502" gracePeriod=10 Feb 27 06:31:34 crc kubenswrapper[4725]: I0227 06:31:34.182597 4725 generic.go:334] "Generic (PLEG): container finished" podID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerID="c83f0cda61e0eeca3c659f093826b055b5486959ac3ce506b78bc7b5675e6502" exitCode=0 Feb 27 06:31:34 crc kubenswrapper[4725]: I0227 06:31:34.182642 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" event={"ID":"aa2e7bcc-3655-4f34-8f6b-1cc325681122","Type":"ContainerDied","Data":"c83f0cda61e0eeca3c659f093826b055b5486959ac3ce506b78bc7b5675e6502"} Feb 27 06:31:34 crc kubenswrapper[4725]: I0227 06:31:34.927179 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Feb 27 06:31:39 crc kubenswrapper[4725]: I0227 06:31:39.927437 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Feb 27 06:31:40 crc kubenswrapper[4725]: E0227 06:31:40.845283 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 27 06:31:40 crc kubenswrapper[4725]: E0227 06:31:40.845402 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 27 06:31:40 crc kubenswrapper[4725]: E0227 06:31:40.845585 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c5h5d7hd5h57fh66bh94h556hdbh5fdh87h599h57fh576h5b9hd4hd4h5d7h585h54fh656h79h5chbbhfh78h69h5c5h58fh546h58ch97hb6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7449x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,R
unAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7c8b7f8fb9-vxlzg_openstack(3e4ea208-f6a4-40f8-9b4e-14bfece1614d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:31:41 crc kubenswrapper[4725]: E0227 06:31:41.057586 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-7c8b7f8fb9-vxlzg" podUID="3e4ea208-f6a4-40f8-9b4e-14bfece1614d" Feb 27 06:31:41 crc kubenswrapper[4725]: E0227 06:31:41.090516 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 27 06:31:41 crc kubenswrapper[4725]: E0227 06:31:41.090613 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 27 06:31:41 crc kubenswrapper[4725]: E0227 06:31:41.090810 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfchb4h5ch5f8h599h546h646h699h565h5f8hb6h58dh86h59bh64h67bh554h58fh584h655h57fh85h58h5ch585h57ch69h56ch54h59dh66ch5b6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8xkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-594f5b48d5-4gn6c_openstack(b9bc5d37-f0f1-4aee-a7d3-040b6c53b156): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:31:41 crc kubenswrapper[4725]: E0227 
06:31:41.093796 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-594f5b48d5-4gn6c" podUID="b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.679973 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.688764 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-credential-keys\") pod \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.688883 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-fernet-keys\") pod \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.688946 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-scripts\") pod \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.689029 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-combined-ca-bundle\") pod 
\"6f4a4c70-f6ec-45e4-a981-ee17af923a95\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.689177 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-config-data\") pod \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.689227 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xf5\" (UniqueName: \"kubernetes.io/projected/6f4a4c70-f6ec-45e4-a981-ee17af923a95-kube-api-access-s8xf5\") pod \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\" (UID: \"6f4a4c70-f6ec-45e4-a981-ee17af923a95\") " Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.693432 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6f4a4c70-f6ec-45e4-a981-ee17af923a95" (UID: "6f4a4c70-f6ec-45e4-a981-ee17af923a95"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.693595 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6f4a4c70-f6ec-45e4-a981-ee17af923a95" (UID: "6f4a4c70-f6ec-45e4-a981-ee17af923a95"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.699322 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4a4c70-f6ec-45e4-a981-ee17af923a95-kube-api-access-s8xf5" (OuterVolumeSpecName: "kube-api-access-s8xf5") pod "6f4a4c70-f6ec-45e4-a981-ee17af923a95" (UID: "6f4a4c70-f6ec-45e4-a981-ee17af923a95"). InnerVolumeSpecName "kube-api-access-s8xf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.700245 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-scripts" (OuterVolumeSpecName: "scripts") pod "6f4a4c70-f6ec-45e4-a981-ee17af923a95" (UID: "6f4a4c70-f6ec-45e4-a981-ee17af923a95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.745467 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-config-data" (OuterVolumeSpecName: "config-data") pod "6f4a4c70-f6ec-45e4-a981-ee17af923a95" (UID: "6f4a4c70-f6ec-45e4-a981-ee17af923a95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.755946 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f4a4c70-f6ec-45e4-a981-ee17af923a95" (UID: "6f4a4c70-f6ec-45e4-a981-ee17af923a95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.791466 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.791501 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8xf5\" (UniqueName: \"kubernetes.io/projected/6f4a4c70-f6ec-45e4-a981-ee17af923a95-kube-api-access-s8xf5\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.791514 4725 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.791523 4725 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.791532 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:42 crc kubenswrapper[4725]: I0227 06:31:42.791541 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4a4c70-f6ec-45e4-a981-ee17af923a95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.012489 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.288561 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sjwvj" 
event={"ID":"6f4a4c70-f6ec-45e4-a981-ee17af923a95","Type":"ContainerDied","Data":"8fa7a92e5ade6e1df1fd6678fabbb8b2a9de2bc73d9511f19054ec6af31cbdca"} Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.288604 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa7a92e5ade6e1df1fd6678fabbb8b2a9de2bc73d9511f19054ec6af31cbdca" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.288617 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sjwvj" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.794791 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sjwvj"] Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.804554 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sjwvj"] Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.895928 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c7fnv"] Feb 27 06:31:43 crc kubenswrapper[4725]: E0227 06:31:43.896598 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4a4c70-f6ec-45e4-a981-ee17af923a95" containerName="keystone-bootstrap" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.896622 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4a4c70-f6ec-45e4-a981-ee17af923a95" containerName="keystone-bootstrap" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.896855 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4a4c70-f6ec-45e4-a981-ee17af923a95" containerName="keystone-bootstrap" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.897618 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.903544 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.903875 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4dkt" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.903990 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.904111 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.904213 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.909125 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-fernet-keys\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.909183 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-scripts\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.909206 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-combined-ca-bundle\") pod \"keystone-bootstrap-c7fnv\" (UID: 
\"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.909224 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-credential-keys\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.909260 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbw5d\" (UniqueName: \"kubernetes.io/projected/9ab5820b-1151-4cc9-ae7b-09b596335d88-kube-api-access-kbw5d\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.909313 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-config-data\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:43 crc kubenswrapper[4725]: I0227 06:31:43.931376 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7fnv"] Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.010991 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-fernet-keys\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.011082 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-scripts\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.011110 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-combined-ca-bundle\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.011136 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-credential-keys\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.011187 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbw5d\" (UniqueName: \"kubernetes.io/projected/9ab5820b-1151-4cc9-ae7b-09b596335d88-kube-api-access-kbw5d\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.011242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-config-data\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.017390 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-combined-ca-bundle\") pod 
\"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.017533 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-config-data\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.017777 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-fernet-keys\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.018737 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-scripts\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.018910 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-credential-keys\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.030923 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbw5d\" (UniqueName: \"kubernetes.io/projected/9ab5820b-1151-4cc9-ae7b-09b596335d88-kube-api-access-kbw5d\") pod \"keystone-bootstrap-c7fnv\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc 
kubenswrapper[4725]: I0227 06:31:44.218522 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:31:44 crc kubenswrapper[4725]: I0227 06:31:44.262357 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4a4c70-f6ec-45e4-a981-ee17af923a95" path="/var/lib/kubelet/pods/6f4a4c70-f6ec-45e4-a981-ee17af923a95/volumes" Feb 27 06:31:49 crc kubenswrapper[4725]: I0227 06:31:49.927879 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Feb 27 06:31:49 crc kubenswrapper[4725]: I0227 06:31:49.928851 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:31:51 crc kubenswrapper[4725]: E0227 06:31:51.965186 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 27 06:31:51 crc kubenswrapper[4725]: E0227 06:31:51.965580 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Feb 27 06:31:51 crc kubenswrapper[4725]: E0227 06:31:51.965703 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.203:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wl8fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zb7cc_openstack(617c62fd-dee8-4bab-a69a-8f348c8487a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:31:51 crc kubenswrapper[4725]: E0227 06:31:51.966886 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zb7cc" 
podUID="617c62fd-dee8-4bab-a69a-8f348c8487a3" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.056384 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.064566 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198096 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-horizon-secret-key\") pod \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198173 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-horizon-secret-key\") pod \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198221 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7449x\" (UniqueName: \"kubernetes.io/projected/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-kube-api-access-7449x\") pod \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198250 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-logs\") pod \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198311 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-config-data\") pod \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198362 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-scripts\") pod \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198402 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-config-data\") pod \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198419 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xkx\" (UniqueName: \"kubernetes.io/projected/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-kube-api-access-p8xkx\") pod \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\" (UID: \"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198494 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-logs\") pod \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.198542 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-scripts\") pod \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\" (UID: \"3e4ea208-f6a4-40f8-9b4e-14bfece1614d\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.199270 4725 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-scripts" (OuterVolumeSpecName: "scripts") pod "3e4ea208-f6a4-40f8-9b4e-14bfece1614d" (UID: "3e4ea208-f6a4-40f8-9b4e-14bfece1614d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.199475 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-config-data" (OuterVolumeSpecName: "config-data") pod "3e4ea208-f6a4-40f8-9b4e-14bfece1614d" (UID: "3e4ea208-f6a4-40f8-9b4e-14bfece1614d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.199661 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-scripts" (OuterVolumeSpecName: "scripts") pod "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" (UID: "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.200157 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-config-data" (OuterVolumeSpecName: "config-data") pod "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" (UID: "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.200822 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-logs" (OuterVolumeSpecName: "logs") pod "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" (UID: "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.200929 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-logs" (OuterVolumeSpecName: "logs") pod "3e4ea208-f6a4-40f8-9b4e-14bfece1614d" (UID: "3e4ea208-f6a4-40f8-9b4e-14bfece1614d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.205451 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-kube-api-access-p8xkx" (OuterVolumeSpecName: "kube-api-access-p8xkx") pod "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" (UID: "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156"). InnerVolumeSpecName "kube-api-access-p8xkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.205998 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3e4ea208-f6a4-40f8-9b4e-14bfece1614d" (UID: "3e4ea208-f6a4-40f8-9b4e-14bfece1614d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.212844 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-kube-api-access-7449x" (OuterVolumeSpecName: "kube-api-access-7449x") pod "3e4ea208-f6a4-40f8-9b4e-14bfece1614d" (UID: "3e4ea208-f6a4-40f8-9b4e-14bfece1614d"). InnerVolumeSpecName "kube-api-access-7449x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.212965 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" (UID: "b9bc5d37-f0f1-4aee-a7d3-040b6c53b156"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302052 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302086 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302099 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302132 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302144 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7449x\" (UniqueName: \"kubernetes.io/projected/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-kube-api-access-7449x\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302156 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302167 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4ea208-f6a4-40f8-9b4e-14bfece1614d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302178 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302205 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.302217 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8xkx\" (UniqueName: \"kubernetes.io/projected/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156-kube-api-access-p8xkx\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.371584 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f5b48d5-4gn6c" event={"ID":"b9bc5d37-f0f1-4aee-a7d3-040b6c53b156","Type":"ContainerDied","Data":"d9df58203eaf1c115de04cae850e3fbf818869900cfcf9ae877c2e8fd9e63c94"} Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.371651 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594f5b48d5-4gn6c" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.374680 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8b7f8fb9-vxlzg" event={"ID":"3e4ea208-f6a4-40f8-9b4e-14bfece1614d","Type":"ContainerDied","Data":"494a854070209818598a5d8eba6a3e5c28924ce079637de55483da8606787647"} Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.374765 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c8b7f8fb9-vxlzg" Feb 27 06:31:52 crc kubenswrapper[4725]: E0227 06:31:52.376740 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.203:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-zb7cc" podUID="617c62fd-dee8-4bab-a69a-8f348c8487a3" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.441399 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594f5b48d5-4gn6c"] Feb 27 06:31:52 crc kubenswrapper[4725]: E0227 06:31:52.442244 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Feb 27 06:31:52 crc kubenswrapper[4725]: E0227 06:31:52.442309 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Feb 27 06:31:52 crc kubenswrapper[4725]: E0227 06:31:52.442421 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.203:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h8dhd9hf4h555h67dh645h556hcfh85h669hdh664hb5h67ch5b4h54dh68ch685h5bbh74h549hfh686h59bh55bh55h6hf4h66fh55h65cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bp94q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(f817188c-5563-4b93-abe7-94305a5c95a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.460069 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-594f5b48d5-4gn6c"] Feb 27 06:31:52 crc kubenswrapper[4725]: W0227 06:31:52.470875 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852f1c61_8d84_42fd_bed8_be55f65b3a4c.slice/crio-f6a7b38351e84f511258efcd42c3df7ede9f73d18816212b096436cedfb57e81 WatchSource:0}: Error finding container f6a7b38351e84f511258efcd42c3df7ede9f73d18816212b096436cedfb57e81: Status 404 returned error can't find the container with id f6a7b38351e84f511258efcd42c3df7ede9f73d18816212b096436cedfb57e81 Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.482673 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c8b7f8fb9-vxlzg"] Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.490901 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-7c8b7f8fb9-vxlzg"] Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.580729 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.585666 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710192 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710316 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-logs\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710339 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-sb\") pod \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710427 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz2qt\" (UniqueName: \"kubernetes.io/projected/aa2e7bcc-3655-4f34-8f6b-1cc325681122-kube-api-access-qz2qt\") pod \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710468 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-scripts\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710553 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-config-data\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710581 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-config\") pod \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710597 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9bfr\" (UniqueName: \"kubernetes.io/projected/2fb89808-3a91-47ca-8d7a-c96f22784048-kube-api-access-g9bfr\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710611 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-nb\") pod \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710635 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-combined-ca-bundle\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710662 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-httpd-run\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710711 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-dns-svc\") pod \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\" (UID: \"aa2e7bcc-3655-4f34-8f6b-1cc325681122\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.710736 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-internal-tls-certs\") pod \"2fb89808-3a91-47ca-8d7a-c96f22784048\" (UID: \"2fb89808-3a91-47ca-8d7a-c96f22784048\") " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.714039 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.714482 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-logs" (OuterVolumeSpecName: "logs") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.717256 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2e7bcc-3655-4f34-8f6b-1cc325681122-kube-api-access-qz2qt" (OuterVolumeSpecName: "kube-api-access-qz2qt") pod "aa2e7bcc-3655-4f34-8f6b-1cc325681122" (UID: "aa2e7bcc-3655-4f34-8f6b-1cc325681122"). InnerVolumeSpecName "kube-api-access-qz2qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.718699 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-scripts" (OuterVolumeSpecName: "scripts") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.721836 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb89808-3a91-47ca-8d7a-c96f22784048-kube-api-access-g9bfr" (OuterVolumeSpecName: "kube-api-access-g9bfr") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "kube-api-access-g9bfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.724122 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.743806 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.761241 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa2e7bcc-3655-4f34-8f6b-1cc325681122" (UID: "aa2e7bcc-3655-4f34-8f6b-1cc325681122"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.770371 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-config" (OuterVolumeSpecName: "config") pod "aa2e7bcc-3655-4f34-8f6b-1cc325681122" (UID: "aa2e7bcc-3655-4f34-8f6b-1cc325681122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.770401 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.773945 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa2e7bcc-3655-4f34-8f6b-1cc325681122" (UID: "aa2e7bcc-3655-4f34-8f6b-1cc325681122"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.786604 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-config-data" (OuterVolumeSpecName: "config-data") pod "2fb89808-3a91-47ca-8d7a-c96f22784048" (UID: "2fb89808-3a91-47ca-8d7a-c96f22784048"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.794082 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa2e7bcc-3655-4f34-8f6b-1cc325681122" (UID: "aa2e7bcc-3655-4f34-8f6b-1cc325681122"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812860 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812883 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812894 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz2qt\" (UniqueName: \"kubernetes.io/projected/aa2e7bcc-3655-4f34-8f6b-1cc325681122-kube-api-access-qz2qt\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812904 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812912 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812920 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812928 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9bfr\" (UniqueName: \"kubernetes.io/projected/2fb89808-3a91-47ca-8d7a-c96f22784048-kube-api-access-g9bfr\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812937 4725 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812945 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812952 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fb89808-3a91-47ca-8d7a-c96f22784048-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812960 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2e7bcc-3655-4f34-8f6b-1cc325681122-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812967 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fb89808-3a91-47ca-8d7a-c96f22784048-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.812993 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.830524 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 27 06:31:52 crc kubenswrapper[4725]: I0227 06:31:52.915836 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 
06:31:53.383393 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"852f1c61-8d84-42fd-bed8-be55f65b3a4c","Type":"ContainerStarted","Data":"f6a7b38351e84f511258efcd42c3df7ede9f73d18816212b096436cedfb57e81"} Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.385325 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" event={"ID":"aa2e7bcc-3655-4f34-8f6b-1cc325681122","Type":"ContainerDied","Data":"b5b60f0bbf1301497a8acd12403a2147fff7747f2d0ff2c5b834b2bebc81ae13"} Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.385368 4725 scope.go:117] "RemoveContainer" containerID="c83f0cda61e0eeca3c659f093826b055b5486959ac3ce506b78bc7b5675e6502" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.385480 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.390245 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2fb89808-3a91-47ca-8d7a-c96f22784048","Type":"ContainerDied","Data":"9e5ca77deee1b4d02b1d91c18a162b058d877d99dd69328d71237ed12379530a"} Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.390463 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.392539 4725 generic.go:334] "Generic (PLEG): container finished" podID="ab103cbe-a833-4f47-8101-d9ea92afe59c" containerID="8d28ad203b2aeaee1320d261fb493db88dc11595d7154df0162195fba412ad0c" exitCode=0 Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.392590 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvhmj" event={"ID":"ab103cbe-a833-4f47-8101-d9ea92afe59c","Type":"ContainerDied","Data":"8d28ad203b2aeaee1320d261fb493db88dc11595d7154df0162195fba412ad0c"} Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.448801 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444c9d757-pr69s"] Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.458316 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444c9d757-pr69s"] Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.470544 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.479745 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.485529 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.486017 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-httpd" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486086 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-httpd" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.486164 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="init" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486230 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="init" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.486302 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-log" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486354 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-log" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.486422 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486478 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486702 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-log" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486778 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" containerName="glance-httpd" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.486844 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.487808 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.492270 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.492553 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.492892 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.519265 4725 scope.go:117] "RemoveContainer" containerID="3ce4aa8455e0478d967e778306adbf24b533de04af0ffe96be00d005cbc134a7" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.554559 4725 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.554606 4725 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.203:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.554740 4725 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.203:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bmj4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-5v8pq_openstack(6a306be2-547c-404f-afc1-4f4639cf7a28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 06:31:53 crc kubenswrapper[4725]: E0227 06:31:53.562466 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-5v8pq" podUID="6a306be2-547c-404f-afc1-4f4639cf7a28" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.649446 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.649549 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.649580 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: 
I0227 06:31:53.649829 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl79w\" (UniqueName: \"kubernetes.io/projected/66dd6dba-4ab7-4fa0-88a5-abdccb202492-kube-api-access-hl79w\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.649880 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-logs\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.649959 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.650069 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.650106 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc 
kubenswrapper[4725]: I0227 06:31:53.751635 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.751915 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.751945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.751985 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.752019 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.752042 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.752093 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl79w\" (UniqueName: \"kubernetes.io/projected/66dd6dba-4ab7-4fa0-88a5-abdccb202492-kube-api-access-hl79w\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.752110 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-logs\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.752628 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.753449 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.761237 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-config-data\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.769157 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-logs\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.770139 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl79w\" (UniqueName: \"kubernetes.io/projected/66dd6dba-4ab7-4fa0-88a5-abdccb202492-kube-api-access-hl79w\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.772307 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.787334 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-scripts\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.787911 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.803475 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " pod="openstack/glance-default-internal-api-0" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.826167 4725 scope.go:117] "RemoveContainer" containerID="659f36e0a32a2c589513eea645018110aebafdfd17701bdc5e62ce81025d429d" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.887232 4725 scope.go:117] "RemoveContainer" containerID="8702692fe9211a5826a801eaf4c11c4ebc38030ec3e2ac5a14ff8405b4b6f679" Feb 27 06:31:53 crc kubenswrapper[4725]: I0227 06:31:53.978927 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.058785 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c7fnv"] Feb 27 06:31:54 crc kubenswrapper[4725]: W0227 06:31:54.066326 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab5820b_1151_4cc9_ae7b_09b596335d88.slice/crio-1a34d75dfde83df1eabb1279409a2db854e70c60f91202ea890bf0406e7670d0 WatchSource:0}: Error finding container 1a34d75dfde83df1eabb1279409a2db854e70c60f91202ea890bf0406e7670d0: Status 404 returned error can't find the container with id 1a34d75dfde83df1eabb1279409a2db854e70c60f91202ea890bf0406e7670d0 Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.180269 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56865cdb4-9hs85"] Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.191401 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f478fcd58-cfjzp"] Feb 27 06:31:54 crc kubenswrapper[4725]: W0227 06:31:54.213845 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4fc5fc3_880a_46c5_a0a1_3248884d9882.slice/crio-d7b6399bad48d193a2b88629e2ab802c17ead92bb73cce34440180ab7c5d8b73 WatchSource:0}: Error finding container d7b6399bad48d193a2b88629e2ab802c17ead92bb73cce34440180ab7c5d8b73: Status 404 returned error can't find the container with id d7b6399bad48d193a2b88629e2ab802c17ead92bb73cce34440180ab7c5d8b73 Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.294537 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb89808-3a91-47ca-8d7a-c96f22784048" path="/var/lib/kubelet/pods/2fb89808-3a91-47ca-8d7a-c96f22784048/volumes" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.297463 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="3e4ea208-f6a4-40f8-9b4e-14bfece1614d" path="/var/lib/kubelet/pods/3e4ea208-f6a4-40f8-9b4e-14bfece1614d/volumes" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.297908 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" path="/var/lib/kubelet/pods/aa2e7bcc-3655-4f34-8f6b-1cc325681122/volumes" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.299135 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bc5d37-f0f1-4aee-a7d3-040b6c53b156" path="/var/lib/kubelet/pods/b9bc5d37-f0f1-4aee-a7d3-040b6c53b156/volumes" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.403988 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"852f1c61-8d84-42fd-bed8-be55f65b3a4c","Type":"ContainerStarted","Data":"642bae2deabf590735398bfe29ed50f4cc91acbbeff8f43348da8d07230674db"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.417808 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fxbvl" event={"ID":"4b95598a-2902-4372-b9f4-a40152f1c45f","Type":"ContainerStarted","Data":"8794d1147b6b8db32bd737d441e4fc901912bda9cdc0753005c03ee0e1534914"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.423811 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerStarted","Data":"e361e5e717fb7489267b59facc66846148c3a089296f8966f2c85e3d4c673f27"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.431203 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56865cdb4-9hs85" event={"ID":"a4fc5fc3-880a-46c5-a0a1-3248884d9882","Type":"ContainerStarted","Data":"d7b6399bad48d193a2b88629e2ab802c17ead92bb73cce34440180ab7c5d8b73"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.438127 4725 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/placement-db-sync-fxbvl" podStartSLOduration=3.368711092 podStartE2EDuration="32.438107321s" podCreationTimestamp="2026-02-27 06:31:22 +0000 UTC" firstStartedPulling="2026-02-27 06:31:24.484275988 +0000 UTC m=+1262.946896557" lastFinishedPulling="2026-02-27 06:31:53.553672207 +0000 UTC m=+1292.016292786" observedRunningTime="2026-02-27 06:31:54.437557226 +0000 UTC m=+1292.900177785" watchObservedRunningTime="2026-02-27 06:31:54.438107321 +0000 UTC m=+1292.900727890" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.450623 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-log" containerID="cri-o://13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9" gracePeriod=30 Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.450748 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-httpd" containerID="cri-o://1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6" gracePeriod=30 Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.451130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154e40c2-11b3-4eec-93c1-6dc57202ac90","Type":"ContainerStarted","Data":"1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.453875 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f478fcd58-cfjzp" event={"ID":"372d4de4-ea8f-4393-af8b-1139e593ac16","Type":"ContainerStarted","Data":"f11aff1e725091cc169c269e13fd629048df45c670fc7bdbc8f36852c5754600"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.461718 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/watcher-decision-engine-0" podStartSLOduration=14.606720287 podStartE2EDuration="33.461701965s" podCreationTimestamp="2026-02-27 06:31:21 +0000 UTC" firstStartedPulling="2026-02-27 06:31:23.661877139 +0000 UTC m=+1262.124497698" lastFinishedPulling="2026-02-27 06:31:42.516858777 +0000 UTC m=+1280.979479376" observedRunningTime="2026-02-27 06:31:54.455224882 +0000 UTC m=+1292.917845461" watchObservedRunningTime="2026-02-27 06:31:54.461701965 +0000 UTC m=+1292.924322534" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.469953 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a4dc45b9-7d75-443c-8712-44dca955b02d","Type":"ContainerStarted","Data":"e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.474700 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc5ddffd5-r9bpn" event={"ID":"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd","Type":"ContainerStarted","Data":"b1aa00d211f4acd6b408c48a72dc95d48f813b8e5132ac819a91abc1e5a48f5b"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.485312 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7fnv" event={"ID":"9ab5820b-1151-4cc9-ae7b-09b596335d88","Type":"ContainerStarted","Data":"cb419507ea31548cd9839a0593ffb565ab0a0f14c6be351d8555d15864a47637"} Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.485351 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7fnv" event={"ID":"9ab5820b-1151-4cc9-ae7b-09b596335d88","Type":"ContainerStarted","Data":"1a34d75dfde83df1eabb1279409a2db854e70c60f91202ea890bf0406e7670d0"} Feb 27 06:31:54 crc kubenswrapper[4725]: E0227 06:31:54.487410 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.203:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-5v8pq" podUID="6a306be2-547c-404f-afc1-4f4639cf7a28" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.490258 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=32.490244217 podStartE2EDuration="32.490244217s" podCreationTimestamp="2026-02-27 06:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:54.489248349 +0000 UTC m=+1292.951868938" watchObservedRunningTime="2026-02-27 06:31:54.490244217 +0000 UTC m=+1292.952864786" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.527983 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c7fnv" podStartSLOduration=11.527957918 podStartE2EDuration="11.527957918s" podCreationTimestamp="2026-02-27 06:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:54.517654158 +0000 UTC m=+1292.980274747" watchObservedRunningTime="2026-02-27 06:31:54.527957918 +0000 UTC m=+1292.990578487" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.544046 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=26.81883559 podStartE2EDuration="33.543982739s" podCreationTimestamp="2026-02-27 06:31:21 +0000 UTC" firstStartedPulling="2026-02-27 06:31:23.645276612 +0000 UTC m=+1262.107897171" lastFinishedPulling="2026-02-27 06:31:30.370423751 +0000 UTC m=+1268.833044320" observedRunningTime="2026-02-27 06:31:54.536189759 +0000 UTC m=+1292.998810348" watchObservedRunningTime="2026-02-27 06:31:54.543982739 +0000 UTC m=+1293.006603308" Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.618196 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:31:54 crc kubenswrapper[4725]: I0227 06:31:54.929563 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6444c9d757-pr69s" podUID="aa2e7bcc-3655-4f34-8f6b-1cc325681122" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.174107 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.288443 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7kds\" (UniqueName: \"kubernetes.io/projected/ab103cbe-a833-4f47-8101-d9ea92afe59c-kube-api-access-v7kds\") pod \"ab103cbe-a833-4f47-8101-d9ea92afe59c\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.288984 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-combined-ca-bundle\") pod \"ab103cbe-a833-4f47-8101-d9ea92afe59c\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.289184 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-config\") pod \"ab103cbe-a833-4f47-8101-d9ea92afe59c\" (UID: \"ab103cbe-a833-4f47-8101-d9ea92afe59c\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.312453 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab103cbe-a833-4f47-8101-d9ea92afe59c-kube-api-access-v7kds" (OuterVolumeSpecName: "kube-api-access-v7kds") pod "ab103cbe-a833-4f47-8101-d9ea92afe59c" (UID: 
"ab103cbe-a833-4f47-8101-d9ea92afe59c"). InnerVolumeSpecName "kube-api-access-v7kds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.329449 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-config" (OuterVolumeSpecName: "config") pod "ab103cbe-a833-4f47-8101-d9ea92afe59c" (UID: "ab103cbe-a833-4f47-8101-d9ea92afe59c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.354996 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab103cbe-a833-4f47-8101-d9ea92afe59c" (UID: "ab103cbe-a833-4f47-8101-d9ea92afe59c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.394685 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.394705 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7kds\" (UniqueName: \"kubernetes.io/projected/ab103cbe-a833-4f47-8101-d9ea92afe59c-kube-api-access-v7kds\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.394715 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab103cbe-a833-4f47-8101-d9ea92afe59c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.522042 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f478fcd58-cfjzp" 
event={"ID":"372d4de4-ea8f-4393-af8b-1139e593ac16","Type":"ContainerStarted","Data":"55b70c7dda9ddff61b689ffd66fe36eaf900a49e159843d02af5fc765f40c9b6"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.522091 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f478fcd58-cfjzp" event={"ID":"372d4de4-ea8f-4393-af8b-1139e593ac16","Type":"ContainerStarted","Data":"32ffb7d5210ab48f38b78570ddc91f95c38f97909105be43dd53a303bb2a880e"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.595534 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc5ddffd5-r9bpn" event={"ID":"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd","Type":"ContainerStarted","Data":"bde0c1fffb3692c65b3046d03b17cbac44a4d9aaeb7e486a9095e3acbf9352bf"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.595726 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc5ddffd5-r9bpn" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon-log" containerID="cri-o://b1aa00d211f4acd6b408c48a72dc95d48f813b8e5132ac819a91abc1e5a48f5b" gracePeriod=30 Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.596038 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc5ddffd5-r9bpn" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon" containerID="cri-o://bde0c1fffb3692c65b3046d03b17cbac44a4d9aaeb7e486a9095e3acbf9352bf" gracePeriod=30 Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.609978 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bbccbc7cf-sts7s"] Feb 27 06:31:55 crc kubenswrapper[4725]: E0227 06:31:55.616517 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab103cbe-a833-4f47-8101-d9ea92afe59c" containerName="neutron-db-sync" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.616554 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ab103cbe-a833-4f47-8101-d9ea92afe59c" containerName="neutron-db-sync" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.617873 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab103cbe-a833-4f47-8101-d9ea92afe59c" containerName="neutron-db-sync" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.627596 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.636467 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.636640 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bvhmj" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.636657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvhmj" event={"ID":"ab103cbe-a833-4f47-8101-d9ea92afe59c","Type":"ContainerDied","Data":"681154434b36c5f6e9d8d1749cb3ab467ae584702fc3afd560bdb9ca63648068"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.639092 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="681154434b36c5f6e9d8d1749cb3ab467ae584702fc3afd560bdb9ca63648068" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.640308 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66dd6dba-4ab7-4fa0-88a5-abdccb202492","Type":"ContainerStarted","Data":"84eb2e2e64d82dbee7a2bb41cce66f7440ba036c6a6d6291693f791609b406ef"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.653607 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"852f1c61-8d84-42fd-bed8-be55f65b3a4c","Type":"ContainerStarted","Data":"aec86448eabc75a84f0a93a84c8ad2c642a8946ca752fa9c488a9ec4aabdca28"} Feb 27 06:31:55 crc 
kubenswrapper[4725]: I0227 06:31:55.655532 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.670764 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f478fcd58-cfjzp" podStartSLOduration=24.670744428 podStartE2EDuration="24.670744428s" podCreationTimestamp="2026-02-27 06:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:55.562215186 +0000 UTC m=+1294.024835755" watchObservedRunningTime="2026-02-27 06:31:55.670744428 +0000 UTC m=+1294.133364997" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.687358 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bbccbc7cf-sts7s"] Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.702657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56865cdb4-9hs85" event={"ID":"a4fc5fc3-880a-46c5-a0a1-3248884d9882","Type":"ContainerStarted","Data":"0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.709531 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-nb\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.709582 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7r7\" (UniqueName: \"kubernetes.io/projected/480f17b9-c37d-4cc1-a611-9500bae66f11-kube-api-access-ww7r7\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " 
pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.709685 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-sb\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.709709 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-swift-storage-0\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.709743 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-config\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.709762 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-svc\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.740220 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cc5ddffd5-r9bpn" podStartSLOduration=4.869001491 podStartE2EDuration="34.74019954s" podCreationTimestamp="2026-02-27 06:31:21 +0000 UTC" firstStartedPulling="2026-02-27 
06:31:23.650794267 +0000 UTC m=+1262.113414836" lastFinishedPulling="2026-02-27 06:31:53.521992316 +0000 UTC m=+1291.984612885" observedRunningTime="2026-02-27 06:31:55.623151389 +0000 UTC m=+1294.085771958" watchObservedRunningTime="2026-02-27 06:31:55.74019954 +0000 UTC m=+1294.202820109" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.740681 4725 generic.go:334] "Generic (PLEG): container finished" podID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerID="1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6" exitCode=143 Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.740715 4725 generic.go:334] "Generic (PLEG): container finished" podID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerID="13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9" exitCode=143 Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.741240 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.741375 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154e40c2-11b3-4eec-93c1-6dc57202ac90","Type":"ContainerDied","Data":"1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.741449 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154e40c2-11b3-4eec-93c1-6dc57202ac90","Type":"ContainerDied","Data":"13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9"} Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.741489 4725 scope.go:117] "RemoveContainer" containerID="1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812244 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-config-data\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812599 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-combined-ca-bundle\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812627 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-public-tls-certs\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812650 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-httpd-run\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812707 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-logs\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812728 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czbvr\" (UniqueName: \"kubernetes.io/projected/154e40c2-11b3-4eec-93c1-6dc57202ac90-kube-api-access-czbvr\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812807 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.812826 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-scripts\") pod \"154e40c2-11b3-4eec-93c1-6dc57202ac90\" (UID: \"154e40c2-11b3-4eec-93c1-6dc57202ac90\") " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.813065 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-sb\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.813086 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-swift-storage-0\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.813131 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-config\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.813161 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-svc\") pod 
\"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.813263 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-nb\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.813725 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-logs" (OuterVolumeSpecName: "logs") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.814184 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.816203 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-config\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.823439 4725 scope.go:117] "RemoveContainer" containerID="13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.826713 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154e40c2-11b3-4eec-93c1-6dc57202ac90-kube-api-access-czbvr" (OuterVolumeSpecName: "kube-api-access-czbvr") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "kube-api-access-czbvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.826796 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.829466 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7r7\" (UniqueName: \"kubernetes.io/projected/480f17b9-c37d-4cc1-a611-9500bae66f11-kube-api-access-ww7r7\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830053 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830075 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154e40c2-11b3-4eec-93c1-6dc57202ac90-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830086 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czbvr\" (UniqueName: \"kubernetes.io/projected/154e40c2-11b3-4eec-93c1-6dc57202ac90-kube-api-access-czbvr\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830108 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830660 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-nb\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830682 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-sb\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830660 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-svc\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.830750 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-swift-storage-0\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.835039 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=28.835020957 podStartE2EDuration="28.835020957s" podCreationTimestamp="2026-02-27 06:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:55.723684687 +0000 UTC m=+1294.186305266" watchObservedRunningTime="2026-02-27 06:31:55.835020957 +0000 UTC m=+1294.297641526" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.837705 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-scripts" (OuterVolumeSpecName: "scripts") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.848581 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.854459 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7r7\" (UniqueName: \"kubernetes.io/projected/480f17b9-c37d-4cc1-a611-9500bae66f11-kube-api-access-ww7r7\") pod \"dnsmasq-dns-6bbccbc7cf-sts7s\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.857501 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78d8db4dd4-8t58x"] Feb 27 06:31:55 crc kubenswrapper[4725]: E0227 06:31:55.858010 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-log" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.858031 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-log" Feb 27 06:31:55 crc kubenswrapper[4725]: E0227 06:31:55.858044 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-httpd" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.858051 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-httpd" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.858234 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-httpd" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.858253 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" containerName="glance-log" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.859408 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.861682 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.866073 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.866335 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.866522 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mpctd" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.870610 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78d8db4dd4-8t58x"] Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.879437 4725 scope.go:117] "RemoveContainer" containerID="1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.879687 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56865cdb4-9hs85" podStartSLOduration=24.879666173 podStartE2EDuration="24.879666173s" podCreationTimestamp="2026-02-27 06:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:55.786347058 +0000 UTC m=+1294.248967627" watchObservedRunningTime="2026-02-27 06:31:55.879666173 +0000 
UTC m=+1294.342286742" Feb 27 06:31:55 crc kubenswrapper[4725]: E0227 06:31:55.884714 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6\": container with ID starting with 1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6 not found: ID does not exist" containerID="1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.884759 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6"} err="failed to get container status \"1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6\": rpc error: code = NotFound desc = could not find container \"1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6\": container with ID starting with 1ef8878e1088319dadec86ac3a5fa30f8e13546ed9a2dc0fa623d903984f0ea6 not found: ID does not exist" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.884782 4725 scope.go:117] "RemoveContainer" containerID="13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.886415 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 27 06:31:55 crc kubenswrapper[4725]: E0227 06:31:55.891695 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9\": container with ID starting with 13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9 not found: ID does not exist" containerID="13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9" Feb 27 06:31:55 crc kubenswrapper[4725]: 
I0227 06:31:55.891728 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9"} err="failed to get container status \"13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9\": rpc error: code = NotFound desc = could not find container \"13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9\": container with ID starting with 13d3ee201e4d142fa2f1caa3107dd58801bfaba04d552ea5d63fbefb748ec6f9 not found: ID does not exist" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.895524 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.895683 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-config-data" (OuterVolumeSpecName: "config-data") pod "154e40c2-11b3-4eec-93c1-6dc57202ac90" (UID: "154e40c2-11b3-4eec-93c1-6dc57202ac90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.931715 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-httpd-config\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.931789 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jstn\" (UniqueName: \"kubernetes.io/projected/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-kube-api-access-2jstn\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.931833 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-config\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.931851 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-ovndb-tls-certs\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.932019 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-combined-ca-bundle\") pod \"neutron-78d8db4dd4-8t58x\" (UID: 
\"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.934211 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.934243 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.934254 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.934264 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.934273 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154e40c2-11b3-4eec-93c1-6dc57202ac90-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:31:55 crc kubenswrapper[4725]: I0227 06:31:55.950177 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.036200 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-combined-ca-bundle\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.036275 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-httpd-config\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.036330 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jstn\" (UniqueName: \"kubernetes.io/projected/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-kube-api-access-2jstn\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.036354 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-config\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.036373 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-ovndb-tls-certs\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc 
kubenswrapper[4725]: I0227 06:31:56.047082 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-combined-ca-bundle\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.047117 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-config\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.049406 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-httpd-config\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.066467 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-ovndb-tls-certs\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.073925 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jstn\" (UniqueName: \"kubernetes.io/projected/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-kube-api-access-2jstn\") pod \"neutron-78d8db4dd4-8t58x\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") " pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.191810 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.202361 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.245914 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.313577 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154e40c2-11b3-4eec-93c1-6dc57202ac90" path="/var/lib/kubelet/pods/154e40c2-11b3-4eec-93c1-6dc57202ac90/volumes" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.314331 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.316197 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.342883 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.343739 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.397749 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.467643 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.467681 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-config-data\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.468045 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-logs\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.468076 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.468096 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.468135 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-scripts\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.468153 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42r6d\" (UniqueName: \"kubernetes.io/projected/a677ff7e-bb0e-4493-897b-c25dab79e22e-kube-api-access-42r6d\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.468193 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.569922 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570177 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-scripts\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570198 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42r6d\" (UniqueName: \"kubernetes.io/projected/a677ff7e-bb0e-4493-897b-c25dab79e22e-kube-api-access-42r6d\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570241 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570346 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570365 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-config-data\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570400 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-logs\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570419 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.570902 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.571537 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.572428 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-logs\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.587801 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.589984 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.590690 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.591357 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-scripts\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.601203 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42r6d\" (UniqueName: \"kubernetes.io/projected/a677ff7e-bb0e-4493-897b-c25dab79e22e-kube-api-access-42r6d\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.611650 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.735839 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.741143 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bbccbc7cf-sts7s"] Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.761774 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56865cdb4-9hs85" event={"ID":"a4fc5fc3-880a-46c5-a0a1-3248884d9882","Type":"ContainerStarted","Data":"07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998"} Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.767512 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66dd6dba-4ab7-4fa0-88a5-abdccb202492","Type":"ContainerStarted","Data":"2a038dd6688e01b1305a86954662a27fed8e771b340b874444f3b95a6a5be964"} Feb 27 06:31:56 crc kubenswrapper[4725]: I0227 06:31:56.799104 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerStarted","Data":"8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff"} Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.100417 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78d8db4dd4-8t58x"] Feb 27 06:31:57 crc kubenswrapper[4725]: W0227 06:31:57.139133 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52add5b7_bfbe_4d4c_ad4c_bf26a3afa096.slice/crio-48f69990b07a5f275825cb0551fd7f201396aca9f56a9558e4af3c98236f6f0a WatchSource:0}: Error finding container 48f69990b07a5f275825cb0551fd7f201396aca9f56a9558e4af3c98236f6f0a: Status 404 returned error can't find the container with id 48f69990b07a5f275825cb0551fd7f201396aca9f56a9558e4af3c98236f6f0a Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.389508 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-applier-0" Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.424568 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.493903 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.493937 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.837328 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66dd6dba-4ab7-4fa0-88a5-abdccb202492","Type":"ContainerStarted","Data":"ddeb12551e0545cef274b108a5df028e84004c848959f496b5e492599e202081"} Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.845686 4725 generic.go:334] "Generic (PLEG): container finished" podID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerID="265436abc44d2f8a2afc887c02aa8aa8de69f672d6e37798fc875d1ed616e468" exitCode=0 Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.845745 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" event={"ID":"480f17b9-c37d-4cc1-a611-9500bae66f11","Type":"ContainerDied","Data":"265436abc44d2f8a2afc887c02aa8aa8de69f672d6e37798fc875d1ed616e468"} Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.845769 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" event={"ID":"480f17b9-c37d-4cc1-a611-9500bae66f11","Type":"ContainerStarted","Data":"444374ca1e3de85d55d7e6fff65f20283878b893c1b056a0b39ba9c8956684eb"} Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.854955 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a677ff7e-bb0e-4493-897b-c25dab79e22e","Type":"ContainerStarted","Data":"83f34c871821d0534c30f1afbd92a2903532c8c897229713baecacb2f321b7da"}
Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.878520 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d8db4dd4-8t58x" event={"ID":"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096","Type":"ContainerStarted","Data":"e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90"}
Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.878779 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d8db4dd4-8t58x" event={"ID":"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096","Type":"ContainerStarted","Data":"48f69990b07a5f275825cb0551fd7f201396aca9f56a9558e4af3c98236f6f0a"}
Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.880023 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 27 06:31:57 crc kubenswrapper[4725]: I0227 06:31:57.887466 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.88744219 podStartE2EDuration="4.88744219s" podCreationTimestamp="2026-02-27 06:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:57.860558084 +0000 UTC m=+1296.323178653" watchObservedRunningTime="2026-02-27 06:31:57.88744219 +0000 UTC m=+1296.350062779"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.535505 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.587077 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b44778b65-pk2td"]
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.588530 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.590800 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.591949 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.611159 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b44778b65-pk2td"]
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725294 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdf2r\" (UniqueName: \"kubernetes.io/projected/4598bdbc-f18b-4709-baa9-013d097a4dfc-kube-api-access-cdf2r\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725356 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-httpd-config\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725384 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-internal-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-combined-ca-bundle\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725450 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-config\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725466 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-ovndb-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.725504 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-public-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.826869 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-public-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.826950 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdf2r\" (UniqueName: \"kubernetes.io/projected/4598bdbc-f18b-4709-baa9-013d097a4dfc-kube-api-access-cdf2r\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.826988 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-httpd-config\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.827009 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-internal-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.827055 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-combined-ca-bundle\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.827076 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-config\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.827091 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-ovndb-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.833015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-ovndb-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.833173 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-config\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.833179 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-httpd-config\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.833890 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-internal-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.838767 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-combined-ca-bundle\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.842604 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-public-tls-certs\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.844920 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdf2r\" (UniqueName: \"kubernetes.io/projected/4598bdbc-f18b-4709-baa9-013d097a4dfc-kube-api-access-cdf2r\") pod \"neutron-6b44778b65-pk2td\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.896890 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a677ff7e-bb0e-4493-897b-c25dab79e22e","Type":"ContainerStarted","Data":"4c97ac891aff3f694b1d884181146abb458296e7945a6ee0c74573d4a0679297"}
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.903985 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d8db4dd4-8t58x" event={"ID":"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096","Type":"ContainerStarted","Data":"a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17"}
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.905149 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78d8db4dd4-8t58x"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.923822 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b44778b65-pk2td"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.927180 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78d8db4dd4-8t58x" podStartSLOduration=3.927161472 podStartE2EDuration="3.927161472s" podCreationTimestamp="2026-02-27 06:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:58.922277724 +0000 UTC m=+1297.384898303" watchObservedRunningTime="2026-02-27 06:31:58.927161472 +0000 UTC m=+1297.389782041"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.939467 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.939487 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" event={"ID":"480f17b9-c37d-4cc1-a611-9500bae66f11","Type":"ContainerStarted","Data":"a2015d783012eb40f3a5917c9c739000fc14f1a0864079acfb13cd89bf9b3a4a"}
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.940497 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s"
Feb 27 06:31:58 crc kubenswrapper[4725]: I0227 06:31:58.968740 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" podStartSLOduration=3.968724971 podStartE2EDuration="3.968724971s" podCreationTimestamp="2026-02-27 06:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:31:58.95802799 +0000 UTC m=+1297.420648559" watchObservedRunningTime="2026-02-27 06:31:58.968724971 +0000 UTC m=+1297.431345540"
Feb 27 06:31:59 crc kubenswrapper[4725]: I0227 06:31:59.445962 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 27 06:31:59 crc kubenswrapper[4725]: I0227 06:31:59.669933 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b44778b65-pk2td"]
Feb 27 06:31:59 crc kubenswrapper[4725]: I0227 06:31:59.952469 4725 generic.go:334] "Generic (PLEG): container finished" podID="4b95598a-2902-4372-b9f4-a40152f1c45f" containerID="8794d1147b6b8db32bd737d441e4fc901912bda9cdc0753005c03ee0e1534914" exitCode=0
Feb 27 06:31:59 crc kubenswrapper[4725]: I0227 06:31:59.952551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fxbvl" event={"ID":"4b95598a-2902-4372-b9f4-a40152f1c45f","Type":"ContainerDied","Data":"8794d1147b6b8db32bd737d441e4fc901912bda9cdc0753005c03ee0e1534914"}
Feb 27 06:31:59 crc kubenswrapper[4725]: I0227 06:31:59.961275 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a677ff7e-bb0e-4493-897b-c25dab79e22e","Type":"ContainerStarted","Data":"6cabf474862b2dc817a7f6161f3736738f19b6db8b3cee4102de28ae88b1fcf0"}
Feb 27 06:31:59 crc kubenswrapper[4725]: I0227 06:31:59.963125 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b44778b65-pk2td" event={"ID":"4598bdbc-f18b-4709-baa9-013d097a4dfc","Type":"ContainerStarted","Data":"10b80694baa957c82c70925cefebc2813af3180201ce0bc4a33b15ffaeff96d4"}
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.136229 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536232-8xqk7"]
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.137805 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536232-8xqk7"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.144890 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.145120 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.145331 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.153847 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536232-8xqk7"]
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.284676 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjbjr\" (UniqueName: \"kubernetes.io/projected/5976615b-8ad2-4d95-9702-5b003064ee5c-kube-api-access-hjbjr\") pod \"auto-csr-approver-29536232-8xqk7\" (UID: \"5976615b-8ad2-4d95-9702-5b003064ee5c\") " pod="openshift-infra/auto-csr-approver-29536232-8xqk7"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.387159 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjbjr\" (UniqueName: \"kubernetes.io/projected/5976615b-8ad2-4d95-9702-5b003064ee5c-kube-api-access-hjbjr\") pod \"auto-csr-approver-29536232-8xqk7\" (UID: \"5976615b-8ad2-4d95-9702-5b003064ee5c\") " pod="openshift-infra/auto-csr-approver-29536232-8xqk7"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.406742 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjbjr\" (UniqueName: \"kubernetes.io/projected/5976615b-8ad2-4d95-9702-5b003064ee5c-kube-api-access-hjbjr\") pod \"auto-csr-approver-29536232-8xqk7\" (UID: \"5976615b-8ad2-4d95-9702-5b003064ee5c\") " pod="openshift-infra/auto-csr-approver-29536232-8xqk7"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.478324 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536232-8xqk7"
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.979590 4725 generic.go:334] "Generic (PLEG): container finished" podID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerID="e361e5e717fb7489267b59facc66846148c3a089296f8966f2c85e3d4c673f27" exitCode=1
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.979632 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerDied","Data":"e361e5e717fb7489267b59facc66846148c3a089296f8966f2c85e3d4c673f27"}
Feb 27 06:32:00 crc kubenswrapper[4725]: I0227 06:32:00.980863 4725 scope.go:117] "RemoveContainer" containerID="e361e5e717fb7489267b59facc66846148c3a089296f8966f2c85e3d4c673f27"
Feb 27 06:32:01 crc kubenswrapper[4725]: I0227 06:32:01.837186 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56865cdb4-9hs85"
Feb 27 06:32:01 crc kubenswrapper[4725]: I0227 06:32:01.838511 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56865cdb4-9hs85"
Feb 27 06:32:01 crc kubenswrapper[4725]: I0227 06:32:01.896729 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f478fcd58-cfjzp"
Feb 27 06:32:01 crc kubenswrapper[4725]: I0227 06:32:01.896827 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7f478fcd58-cfjzp"
Feb 27 06:32:02 crc kubenswrapper[4725]: E0227 06:32:02.167625 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab5820b_1151_4cc9_ae7b_09b596335d88.slice/crio-conmon-cb419507ea31548cd9839a0593ffb565ab0a0f14c6be351d8555d15864a47637.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.389206 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.422971 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.479249 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cc5ddffd5-r9bpn"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.492563 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.492603 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.554668 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.554898 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.577220 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fxbvl"
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.733195 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-config-data\") pod \"4b95598a-2902-4372-b9f4-a40152f1c45f\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") "
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.733351 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86lsq\" (UniqueName: \"kubernetes.io/projected/4b95598a-2902-4372-b9f4-a40152f1c45f-kube-api-access-86lsq\") pod \"4b95598a-2902-4372-b9f4-a40152f1c45f\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") "
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.733482 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-combined-ca-bundle\") pod \"4b95598a-2902-4372-b9f4-a40152f1c45f\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") "
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.733517 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b95598a-2902-4372-b9f4-a40152f1c45f-logs\") pod \"4b95598a-2902-4372-b9f4-a40152f1c45f\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") "
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.733581 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-scripts\") pod \"4b95598a-2902-4372-b9f4-a40152f1c45f\" (UID: \"4b95598a-2902-4372-b9f4-a40152f1c45f\") "
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.736496 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b95598a-2902-4372-b9f4-a40152f1c45f-logs" (OuterVolumeSpecName: "logs") pod "4b95598a-2902-4372-b9f4-a40152f1c45f" (UID: "4b95598a-2902-4372-b9f4-a40152f1c45f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.742399 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-scripts" (OuterVolumeSpecName: "scripts") pod "4b95598a-2902-4372-b9f4-a40152f1c45f" (UID: "4b95598a-2902-4372-b9f4-a40152f1c45f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.742437 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b95598a-2902-4372-b9f4-a40152f1c45f-kube-api-access-86lsq" (OuterVolumeSpecName: "kube-api-access-86lsq") pod "4b95598a-2902-4372-b9f4-a40152f1c45f" (UID: "4b95598a-2902-4372-b9f4-a40152f1c45f"). InnerVolumeSpecName "kube-api-access-86lsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.764367 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b95598a-2902-4372-b9f4-a40152f1c45f" (UID: "4b95598a-2902-4372-b9f4-a40152f1c45f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.774988 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-config-data" (OuterVolumeSpecName: "config-data") pod "4b95598a-2902-4372-b9f4-a40152f1c45f" (UID: "4b95598a-2902-4372-b9f4-a40152f1c45f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.837679 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.838009 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b95598a-2902-4372-b9f4-a40152f1c45f-logs\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.838135 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.838149 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b95598a-2902-4372-b9f4-a40152f1c45f-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.838157 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86lsq\" (UniqueName: \"kubernetes.io/projected/4b95598a-2902-4372-b9f4-a40152f1c45f-kube-api-access-86lsq\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.844467 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536232-8xqk7"]
Feb 27 06:32:02 crc kubenswrapper[4725]: I0227 06:32:02.998214 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b44778b65-pk2td" event={"ID":"4598bdbc-f18b-4709-baa9-013d097a4dfc","Type":"ContainerStarted","Data":"0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d"}
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.001429 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fxbvl"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.001435 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fxbvl" event={"ID":"4b95598a-2902-4372-b9f4-a40152f1c45f","Type":"ContainerDied","Data":"4727e1c073ca2f5af69493d330726c640223c5e8fba93ec9fc13afb251235f57"}
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.001498 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4727e1c073ca2f5af69493d330726c640223c5e8fba93ec9fc13afb251235f57"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.003146 4725 generic.go:334] "Generic (PLEG): container finished" podID="9ab5820b-1151-4cc9-ae7b-09b596335d88" containerID="cb419507ea31548cd9839a0593ffb565ab0a0f14c6be351d8555d15864a47637" exitCode=0
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.003253 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7fnv" event={"ID":"9ab5820b-1151-4cc9-ae7b-09b596335d88","Type":"ContainerDied","Data":"cb419507ea31548cd9839a0593ffb565ab0a0f14c6be351d8555d15864a47637"}
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.025166 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.025150673 podStartE2EDuration="7.025150673s" podCreationTimestamp="2026-02-27 06:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:03.021670785 +0000 UTC m=+1301.484291354" watchObservedRunningTime="2026-02-27 06:32:03.025150673 +0000 UTC m=+1301.487771242"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.060321 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.105490 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.680441 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-774b97d54d-8xt86"]
Feb 27 06:32:03 crc kubenswrapper[4725]: E0227 06:32:03.680805 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b95598a-2902-4372-b9f4-a40152f1c45f" containerName="placement-db-sync"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.680818 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b95598a-2902-4372-b9f4-a40152f1c45f" containerName="placement-db-sync"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.680974 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b95598a-2902-4372-b9f4-a40152f1c45f" containerName="placement-db-sync"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.681855 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.685589 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.685966 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.686034 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.686045 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.686157 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-n8hxg"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.702023 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-774b97d54d-8xt86"]
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.756949 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-combined-ca-bundle\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.757028 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-scripts\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.757126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768d31aa-5226-4119-b1d2-f66786c695f1-logs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.757261 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-public-tls-certs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.757373 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-config-data\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.757582 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-internal-tls-certs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.757756 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn74s\" (UniqueName: \"kubernetes.io/projected/768d31aa-5226-4119-b1d2-f66786c695f1-kube-api-access-wn74s\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn74s\" (UniqueName: \"kubernetes.io/projected/768d31aa-5226-4119-b1d2-f66786c695f1-kube-api-access-wn74s\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867259 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-combined-ca-bundle\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867329 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-scripts\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867414 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768d31aa-5226-4119-b1d2-f66786c695f1-logs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867466 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-public-tls-certs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867504 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-config-data\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.867607 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-internal-tls-certs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.868006 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768d31aa-5226-4119-b1d2-f66786c695f1-logs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.874736 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-config-data\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.875137 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-internal-tls-certs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.881142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-scripts\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.881239 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-combined-ca-bundle\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.909690 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-public-tls-certs\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.914501 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn74s\" (UniqueName: \"kubernetes.io/projected/768d31aa-5226-4119-b1d2-f66786c695f1-kube-api-access-wn74s\") pod \"placement-774b97d54d-8xt86\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.980056 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.980405 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:03 crc kubenswrapper[4725]: I0227 06:32:03.998400 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-774b97d54d-8xt86"
Feb 27 06:32:04 crc kubenswrapper[4725]: I0227 06:32:04.016406 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:04 crc kubenswrapper[4725]: I0227 06:32:04.016710 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:04 crc kubenswrapper[4725]: I0227 06:32:04.036812 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:05 crc kubenswrapper[4725]: I0227 06:32:05.024071 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:05 crc kubenswrapper[4725]: I0227 06:32:05.024993 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="a4dc45b9-7d75-443c-8712-44dca955b02d" containerName="watcher-applier" containerID="cri-o://e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" gracePeriod=30
Feb 27 06:32:05 crc kubenswrapper[4725]: I0227 06:32:05.951431 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.033068 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.053106 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7767444847-ftb89"] Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.053342 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7767444847-ftb89" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerName="dnsmasq-dns" containerID="cri-o://e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd" gracePeriod=10 Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.738573 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.738832 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.799503 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 06:32:06 crc kubenswrapper[4725]: I0227 06:32:06.812739 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.030888 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.066327 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.073679 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c7fnv" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.073686 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c7fnv" event={"ID":"9ab5820b-1151-4cc9-ae7b-09b596335d88","Type":"ContainerDied","Data":"1a34d75dfde83df1eabb1279409a2db854e70c60f91202ea890bf0406e7670d0"} Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.074399 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a34d75dfde83df1eabb1279409a2db854e70c60f91202ea890bf0406e7670d0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.091887 4725 generic.go:334] "Generic (PLEG): container finished" podID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerID="e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd" exitCode=0 Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.092259 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7767444847-ftb89" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.093151 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7767444847-ftb89" event={"ID":"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f","Type":"ContainerDied","Data":"e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd"} Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.093184 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7767444847-ftb89" event={"ID":"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f","Type":"ContainerDied","Data":"b1a5e717f6c8651f57baa790c087c49ff6e4361ad1c344f6c024b6c13b5f48dd"} Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.093203 4725 scope.go:117] "RemoveContainer" containerID="e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.106033 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29536232-8xqk7" event={"ID":"5976615b-8ad2-4d95-9702-5b003064ee5c","Type":"ContainerStarted","Data":"4de6ab32c242b713117986cac707903928f3ef87a68c933521380662e0846c85"} Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.106066 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.106162 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.142956 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-svc\") pod \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-fernet-keys\") pod \"9ab5820b-1151-4cc9-ae7b-09b596335d88\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143047 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-config-data\") pod \"9ab5820b-1151-4cc9-ae7b-09b596335d88\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143112 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbw5d\" (UniqueName: \"kubernetes.io/projected/9ab5820b-1151-4cc9-ae7b-09b596335d88-kube-api-access-kbw5d\") pod \"9ab5820b-1151-4cc9-ae7b-09b596335d88\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " Feb 27 06:32:07 crc 
kubenswrapper[4725]: I0227 06:32:07.143201 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-sb\") pod \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143226 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-swift-storage-0\") pod \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143326 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-combined-ca-bundle\") pod \"9ab5820b-1151-4cc9-ae7b-09b596335d88\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143388 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-nb\") pod \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143429 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-scripts\") pod \"9ab5820b-1151-4cc9-ae7b-09b596335d88\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143473 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-credential-keys\") pod \"9ab5820b-1151-4cc9-ae7b-09b596335d88\" (UID: \"9ab5820b-1151-4cc9-ae7b-09b596335d88\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143502 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb22r\" (UniqueName: \"kubernetes.io/projected/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-kube-api-access-gb22r\") pod \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.143558 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-config\") pod \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\" (UID: \"88e9fc59-6fe2-4d72-a256-a13e83a9ea9f\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.149493 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab5820b-1151-4cc9-ae7b-09b596335d88-kube-api-access-kbw5d" (OuterVolumeSpecName: "kube-api-access-kbw5d") pod "9ab5820b-1151-4cc9-ae7b-09b596335d88" (UID: "9ab5820b-1151-4cc9-ae7b-09b596335d88"). InnerVolumeSpecName "kube-api-access-kbw5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.164156 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-scripts" (OuterVolumeSpecName: "scripts") pod "9ab5820b-1151-4cc9-ae7b-09b596335d88" (UID: "9ab5820b-1151-4cc9-ae7b-09b596335d88"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.171280 4725 scope.go:117] "RemoveContainer" containerID="6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.171305 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9ab5820b-1151-4cc9-ae7b-09b596335d88" (UID: "9ab5820b-1151-4cc9-ae7b-09b596335d88"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.179506 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-kube-api-access-gb22r" (OuterVolumeSpecName: "kube-api-access-gb22r") pod "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" (UID: "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f"). InnerVolumeSpecName "kube-api-access-gb22r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.184305 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9ab5820b-1151-4cc9-ae7b-09b596335d88" (UID: "9ab5820b-1151-4cc9-ae7b-09b596335d88"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.222109 4725 scope.go:117] "RemoveContainer" containerID="e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd" Feb 27 06:32:07 crc kubenswrapper[4725]: E0227 06:32:07.222637 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd\": container with ID starting with e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd not found: ID does not exist" containerID="e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.222670 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd"} err="failed to get container status \"e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd\": rpc error: code = NotFound desc = could not find container \"e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd\": container with ID starting with e77c703640f015f3e8b9b808bb02476afa9c32855c04dae047f0dacc72549fcd not found: ID does not exist" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.222690 4725 scope.go:117] "RemoveContainer" containerID="6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200" Feb 27 06:32:07 crc kubenswrapper[4725]: E0227 06:32:07.223263 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200\": container with ID starting with 6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200 not found: ID does not exist" containerID="6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.223294 
4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200"} err="failed to get container status \"6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200\": rpc error: code = NotFound desc = could not find container \"6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200\": container with ID starting with 6538197a7f8d90e9e1b9035ece3375947c15a64bdf0df832ce2d0fa56fb8b200 not found: ID does not exist" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.236923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab5820b-1151-4cc9-ae7b-09b596335d88" (UID: "9ab5820b-1151-4cc9-ae7b-09b596335d88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.247596 4725 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.247633 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbw5d\" (UniqueName: \"kubernetes.io/projected/9ab5820b-1151-4cc9-ae7b-09b596335d88-kube-api-access-kbw5d\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.247648 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.247660 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.247671 4725 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.247683 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb22r\" (UniqueName: \"kubernetes.io/projected/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-kube-api-access-gb22r\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.344484 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-774b97d54d-8xt86"] Feb 27 06:32:07 crc kubenswrapper[4725]: E0227 06:32:07.389324 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7 is running failed: container process not found" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.394334 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-config" (OuterVolumeSpecName: "config") pod "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" (UID: "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: E0227 06:32:07.401151 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7 is running failed: container process not found" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 27 06:32:07 crc kubenswrapper[4725]: E0227 06:32:07.401724 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7 is running failed: container process not found" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 27 06:32:07 crc kubenswrapper[4725]: E0227 06:32:07.401756 4725 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="a4dc45b9-7d75-443c-8712-44dca955b02d" containerName="watcher-applier" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.404386 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-config-data" (OuterVolumeSpecName: "config-data") pod "9ab5820b-1151-4cc9-ae7b-09b596335d88" (UID: "9ab5820b-1151-4cc9-ae7b-09b596335d88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.420698 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" (UID: "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.444596 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" (UID: "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.451941 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.451968 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.451977 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab5820b-1151-4cc9-ae7b-09b596335d88-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.451987 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc 
kubenswrapper[4725]: I0227 06:32:07.453854 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" (UID: "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.471819 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" (UID: "88e9fc59-6fe2-4d72-a256-a13e83a9ea9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.518530 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.537834 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.554096 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.554126 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.929368 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7767444847-ftb89"] Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.933731 4725 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.970515 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-config-data\") pod \"a4dc45b9-7d75-443c-8712-44dca955b02d\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.970567 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-combined-ca-bundle\") pod \"a4dc45b9-7d75-443c-8712-44dca955b02d\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.970588 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4dc45b9-7d75-443c-8712-44dca955b02d-logs\") pod \"a4dc45b9-7d75-443c-8712-44dca955b02d\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.970617 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm9vq\" (UniqueName: \"kubernetes.io/projected/a4dc45b9-7d75-443c-8712-44dca955b02d-kube-api-access-sm9vq\") pod \"a4dc45b9-7d75-443c-8712-44dca955b02d\" (UID: \"a4dc45b9-7d75-443c-8712-44dca955b02d\") " Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.976639 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dc45b9-7d75-443c-8712-44dca955b02d-kube-api-access-sm9vq" (OuterVolumeSpecName: "kube-api-access-sm9vq") pod "a4dc45b9-7d75-443c-8712-44dca955b02d" (UID: "a4dc45b9-7d75-443c-8712-44dca955b02d"). InnerVolumeSpecName "kube-api-access-sm9vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.992987 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dc45b9-7d75-443c-8712-44dca955b02d-logs" (OuterVolumeSpecName: "logs") pod "a4dc45b9-7d75-443c-8712-44dca955b02d" (UID: "a4dc45b9-7d75-443c-8712-44dca955b02d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:07 crc kubenswrapper[4725]: I0227 06:32:07.997878 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7767444847-ftb89"] Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.072065 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4dc45b9-7d75-443c-8712-44dca955b02d-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.072092 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm9vq\" (UniqueName: \"kubernetes.io/projected/a4dc45b9-7d75-443c-8712-44dca955b02d-kube-api-access-sm9vq\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.078461 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4dc45b9-7d75-443c-8712-44dca955b02d" (UID: "a4dc45b9-7d75-443c-8712-44dca955b02d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.124562 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-config-data" (OuterVolumeSpecName: "config-data") pod "a4dc45b9-7d75-443c-8712-44dca955b02d" (UID: "a4dc45b9-7d75-443c-8712-44dca955b02d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.177455 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.177489 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc45b9-7d75-443c-8712-44dca955b02d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.197817 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6fccbd6487-2trpv"] Feb 27 06:32:08 crc kubenswrapper[4725]: E0227 06:32:08.198186 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dc45b9-7d75-443c-8712-44dca955b02d" containerName="watcher-applier" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198197 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dc45b9-7d75-443c-8712-44dca955b02d" containerName="watcher-applier" Feb 27 06:32:08 crc kubenswrapper[4725]: E0227 06:32:08.198213 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab5820b-1151-4cc9-ae7b-09b596335d88" containerName="keystone-bootstrap" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198219 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab5820b-1151-4cc9-ae7b-09b596335d88" containerName="keystone-bootstrap" Feb 27 06:32:08 crc kubenswrapper[4725]: E0227 06:32:08.198233 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerName="dnsmasq-dns" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198239 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerName="dnsmasq-dns" Feb 27 06:32:08 crc kubenswrapper[4725]: E0227 06:32:08.198253 4725 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerName="init" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198259 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerName="init" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198447 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" containerName="dnsmasq-dns" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198460 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab5820b-1151-4cc9-ae7b-09b596335d88" containerName="keystone-bootstrap" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.198471 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dc45b9-7d75-443c-8712-44dca955b02d" containerName="watcher-applier" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.199034 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.202470 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b44778b65-pk2td" event={"ID":"4598bdbc-f18b-4709-baa9-013d097a4dfc","Type":"ContainerStarted","Data":"2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.203395 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b44778b65-pk2td" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.208026 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.208371 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4dkt" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.208559 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.208725 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.208889 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.209053 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.225383 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6fccbd6487-2trpv"] Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.238553 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerStarted","Data":"768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279617 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-scripts\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279755 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-internal-tls-certs\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279788 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-fernet-keys\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279814 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-public-tls-certs\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279901 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-config-data\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279925 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42x8s\" (UniqueName: \"kubernetes.io/projected/c5d7d934-34b3-46a7-94d2-0803780d5837-kube-api-access-42x8s\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.279956 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-combined-ca-bundle\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.280042 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-credential-keys\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.324669 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b44778b65-pk2td" podStartSLOduration=10.324652507 podStartE2EDuration="10.324652507s" podCreationTimestamp="2026-02-27 06:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:08.276198094 +0000 UTC m=+1306.738818663" watchObservedRunningTime="2026-02-27 06:32:08.324652507 +0000 UTC m=+1306.787273076" Feb 27 
06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.327456 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e9fc59-6fe2-4d72-a256-a13e83a9ea9f" path="/var/lib/kubelet/pods/88e9fc59-6fe2-4d72-a256-a13e83a9ea9f/volumes" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.330301 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerStarted","Data":"bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.330343 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774b97d54d-8xt86" event={"ID":"768d31aa-5226-4119-b1d2-f66786c695f1","Type":"ContainerStarted","Data":"cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.330357 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774b97d54d-8xt86" event={"ID":"768d31aa-5226-4119-b1d2-f66786c695f1","Type":"ContainerStarted","Data":"8d250688552b6c562756bc0ef1ab465b89bad976968887cb0cd4b2b350054400"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.352064 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zb7cc" event={"ID":"617c62fd-dee8-4bab-a69a-8f348c8487a3","Type":"ContainerStarted","Data":"4f405d997ffaff8df8f72728e5663cade2aff7b899d3f9ed93a8cb96bbd91b7d"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.381565 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-config-data\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.381819 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-42x8s\" (UniqueName: \"kubernetes.io/projected/c5d7d934-34b3-46a7-94d2-0803780d5837-kube-api-access-42x8s\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.381903 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-combined-ca-bundle\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.382015 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-credential-keys\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.382098 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-scripts\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.382203 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-internal-tls-certs\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.382275 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-fernet-keys\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.385783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-public-tls-certs\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.385675 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zb7cc" podStartSLOduration=3.31779555 podStartE2EDuration="46.385657632s" podCreationTimestamp="2026-02-27 06:31:22 +0000 UTC" firstStartedPulling="2026-02-27 06:31:24.336388229 +0000 UTC m=+1262.799008798" lastFinishedPulling="2026-02-27 06:32:07.404250311 +0000 UTC m=+1305.866870880" observedRunningTime="2026-02-27 06:32:08.380421285 +0000 UTC m=+1306.843041854" watchObservedRunningTime="2026-02-27 06:32:08.385657632 +0000 UTC m=+1306.848278201" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.397431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-combined-ca-bundle\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.397594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-public-tls-certs\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc 
kubenswrapper[4725]: I0227 06:32:08.397901 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-scripts\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.398681 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-config-data\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.404280 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-fernet-keys\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.404631 4725 generic.go:334] "Generic (PLEG): container finished" podID="a4dc45b9-7d75-443c-8712-44dca955b02d" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" exitCode=0 Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.405764 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.406389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a4dc45b9-7d75-443c-8712-44dca955b02d","Type":"ContainerDied","Data":"e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.406417 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a4dc45b9-7d75-443c-8712-44dca955b02d","Type":"ContainerDied","Data":"a6ba534cc3da1867c6f02806239f2a692feed875bc3e0e8126850f5e389e100c"} Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.406431 4725 scope.go:117] "RemoveContainer" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.408105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-internal-tls-certs\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.409924 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5d7d934-34b3-46a7-94d2-0803780d5837-credential-keys\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.416863 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42x8s\" (UniqueName: \"kubernetes.io/projected/c5d7d934-34b3-46a7-94d2-0803780d5837-kube-api-access-42x8s\") pod \"keystone-6fccbd6487-2trpv\" (UID: \"c5d7d934-34b3-46a7-94d2-0803780d5837\") " pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 
crc kubenswrapper[4725]: I0227 06:32:08.442361 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.457521 4725 scope.go:117] "RemoveContainer" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" Feb 27 06:32:08 crc kubenswrapper[4725]: E0227 06:32:08.465085 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7\": container with ID starting with e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7 not found: ID does not exist" containerID="e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.465132 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7"} err="failed to get container status \"e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7\": rpc error: code = NotFound desc = could not find container \"e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7\": container with ID starting with e1f841e624f86d37f6ac2cf00492aa7c3baed872ab667d05dacc752203b99ba7 not found: ID does not exist" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.469393 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.496166 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.547063 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.547402 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.566021 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.607716 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.696273 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-logs\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.696450 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlhvj\" (UniqueName: \"kubernetes.io/projected/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-kube-api-access-qlhvj\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.696471 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-config-data\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.696510 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 
06:32:08.798110 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlhvj\" (UniqueName: \"kubernetes.io/projected/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-kube-api-access-qlhvj\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.798154 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-config-data\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.798199 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.798279 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-logs\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.798711 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-logs\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.825353 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlhvj\" (UniqueName: 
\"kubernetes.io/projected/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-kube-api-access-qlhvj\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.825394 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.825867 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-config-data\") pod \"watcher-applier-0\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " pod="openstack/watcher-applier-0" Feb 27 06:32:08 crc kubenswrapper[4725]: I0227 06:32:08.891756 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.222277 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6fccbd6487-2trpv"] Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.371898 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.371961 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.468676 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774b97d54d-8xt86" event={"ID":"768d31aa-5226-4119-b1d2-f66786c695f1","Type":"ContainerStarted","Data":"cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9"} Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.468966 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-774b97d54d-8xt86" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.468989 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-774b97d54d-8xt86" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.518254 4725 generic.go:334] "Generic (PLEG): container finished" podID="5976615b-8ad2-4d95-9702-5b003064ee5c" containerID="eb26b6f9487a111ec10255aa47f4c6aedd16d7e02c896d1ff37e665ab035efa5" exitCode=0 Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.518356 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536232-8xqk7" event={"ID":"5976615b-8ad2-4d95-9702-5b003064ee5c","Type":"ContainerDied","Data":"eb26b6f9487a111ec10255aa47f4c6aedd16d7e02c896d1ff37e665ab035efa5"} Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.545350 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:32:09 crc 
kubenswrapper[4725]: I0227 06:32:09.545543 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.545349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fccbd6487-2trpv" event={"ID":"c5d7d934-34b3-46a7-94d2-0803780d5837","Type":"ContainerStarted","Data":"56fbfee4c01c133964a60d9583f1734789653786e650ede1177795c86fd4bc23"} Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.561172 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-774b97d54d-8xt86" podStartSLOduration=6.561133032 podStartE2EDuration="6.561133032s" podCreationTimestamp="2026-02-27 06:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:09.505757334 +0000 UTC m=+1307.968377933" watchObservedRunningTime="2026-02-27 06:32:09.561133032 +0000 UTC m=+1308.023753601" Feb 27 06:32:09 crc kubenswrapper[4725]: I0227 06:32:09.612911 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.267098 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dc45b9-7d75-443c-8712-44dca955b02d" path="/var/lib/kubelet/pods/a4dc45b9-7d75-443c-8712-44dca955b02d/volumes" Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.564963 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"c42fc5be-c4bc-4ebc-8604-8d088212fbb5","Type":"ContainerStarted","Data":"f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2"} Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.565010 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" 
event={"ID":"c42fc5be-c4bc-4ebc-8604-8d088212fbb5","Type":"ContainerStarted","Data":"ef206f7fe3e449dd1ea9f4a2a81ed9b86e2d2ce31e52a5a6c3792422738668fd"} Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.569170 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fccbd6487-2trpv" event={"ID":"c5d7d934-34b3-46a7-94d2-0803780d5837","Type":"ContainerStarted","Data":"649c852aa1a37471f7796ef545f37c15cb287d4a349ca469f80e92f7b993eb44"} Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.569489 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6fccbd6487-2trpv" Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.584827 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.585038 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api-log" containerID="cri-o://642bae2deabf590735398bfe29ed50f4cc91acbbeff8f43348da8d07230674db" gracePeriod=30 Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.585164 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api" containerID="cri-o://aec86448eabc75a84f0a93a84c8ad2c642a8946ca752fa9c488a9ec4aabdca28" gracePeriod=30 Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.595836 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.595823511 podStartE2EDuration="2.595823511s" podCreationTimestamp="2026-02-27 06:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:10.592436845 +0000 UTC m=+1309.055057414" watchObservedRunningTime="2026-02-27 
06:32:10.595823511 +0000 UTC m=+1309.058444080" Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.633462 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6fccbd6487-2trpv" podStartSLOduration=2.633445249 podStartE2EDuration="2.633445249s" podCreationTimestamp="2026-02-27 06:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:10.617617064 +0000 UTC m=+1309.080237633" watchObservedRunningTime="2026-02-27 06:32:10.633445249 +0000 UTC m=+1309.096065818" Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.957347 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 06:32:10 crc kubenswrapper[4725]: I0227 06:32:10.957424 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.191926 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536232-8xqk7" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.261099 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjbjr\" (UniqueName: \"kubernetes.io/projected/5976615b-8ad2-4d95-9702-5b003064ee5c-kube-api-access-hjbjr\") pod \"5976615b-8ad2-4d95-9702-5b003064ee5c\" (UID: \"5976615b-8ad2-4d95-9702-5b003064ee5c\") " Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.267785 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5976615b-8ad2-4d95-9702-5b003064ee5c-kube-api-access-hjbjr" (OuterVolumeSpecName: "kube-api-access-hjbjr") pod "5976615b-8ad2-4d95-9702-5b003064ee5c" (UID: "5976615b-8ad2-4d95-9702-5b003064ee5c"). InnerVolumeSpecName "kube-api-access-hjbjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.363542 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjbjr\" (UniqueName: \"kubernetes.io/projected/5976615b-8ad2-4d95-9702-5b003064ee5c-kube-api-access-hjbjr\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.402499 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.582570 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536232-8xqk7" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.584461 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536232-8xqk7" event={"ID":"5976615b-8ad2-4d95-9702-5b003064ee5c","Type":"ContainerDied","Data":"4de6ab32c242b713117986cac707903928f3ef87a68c933521380662e0846c85"} Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.584522 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de6ab32c242b713117986cac707903928f3ef87a68c933521380662e0846c85" Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.594111 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5v8pq" event={"ID":"6a306be2-547c-404f-afc1-4f4639cf7a28","Type":"ContainerStarted","Data":"84f332f8690913a03ca370eeaa881c97db72381090f14bef26d6fa16a82344ff"} Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.598902 4725 generic.go:334] "Generic (PLEG): container finished" podID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerID="642bae2deabf590735398bfe29ed50f4cc91acbbeff8f43348da8d07230674db" exitCode=143 Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.599890 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"852f1c61-8d84-42fd-bed8-be55f65b3a4c","Type":"ContainerDied","Data":"642bae2deabf590735398bfe29ed50f4cc91acbbeff8f43348da8d07230674db"} Feb 27 06:32:11 crc kubenswrapper[4725]: I0227 06:32:11.647421 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-5v8pq" podStartSLOduration=4.770179967 podStartE2EDuration="49.647400876s" podCreationTimestamp="2026-02-27 06:31:22 +0000 UTC" firstStartedPulling="2026-02-27 06:31:24.52522844 +0000 UTC m=+1262.987848999" lastFinishedPulling="2026-02-27 06:32:09.402449339 +0000 UTC m=+1307.865069908" observedRunningTime="2026-02-27 06:32:11.638701381 +0000 UTC m=+1310.101321970" watchObservedRunningTime="2026-02-27 06:32:11.647400876 +0000 UTC m=+1310.110021455" Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.272562 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536226-hf8lc"] Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.283806 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536226-hf8lc"] Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.492145 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.493106 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.493501 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Feb 27 
06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.548613 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.613486 4725 generic.go:334] "Generic (PLEG): container finished" podID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerID="aec86448eabc75a84f0a93a84c8ad2c642a8946ca752fa9c488a9ec4aabdca28" exitCode=0 Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.614656 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"852f1c61-8d84-42fd-bed8-be55f65b3a4c","Type":"ContainerDied","Data":"aec86448eabc75a84f0a93a84c8ad2c642a8946ca752fa9c488a9ec4aabdca28"} Feb 27 06:32:12 crc kubenswrapper[4725]: I0227 06:32:12.614697 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:12 crc kubenswrapper[4725]: E0227 06:32:12.670580 4725 log.go:32] "ExecSync cmd from runtime service failed" err=< Feb 27 06:32:12 crc kubenswrapper[4725]: rpc error: code = Unknown desc = command error: setns `mnt`: Bad file descriptor Feb 27 06:32:12 crc kubenswrapper[4725]: fail startup Feb 27 06:32:12 crc kubenswrapper[4725]: , stdout: , stderr: , exit code -1 Feb 27 06:32:12 crc kubenswrapper[4725]: > containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 27 06:32:12 crc kubenswrapper[4725]: E0227 06:32:12.671006 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e is running failed: container process not found" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 27 06:32:12 crc kubenswrapper[4725]: E0227 
06:32:12.671324 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e is running failed: container process not found" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Feb 27 06:32:12 crc kubenswrapper[4725]: E0227 06:32:12.671355 4725 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.005454 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.204152 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852f1c61-8d84-42fd-bed8-be55f65b3a4c-logs\") pod \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.204242 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9458\" (UniqueName: \"kubernetes.io/projected/852f1c61-8d84-42fd-bed8-be55f65b3a4c-kube-api-access-v9458\") pod \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.204308 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-config-data\") pod \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.204350 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-combined-ca-bundle\") pod \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.204379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-custom-prometheus-ca\") pod \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\" (UID: \"852f1c61-8d84-42fd-bed8-be55f65b3a4c\") " Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.205865 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/852f1c61-8d84-42fd-bed8-be55f65b3a4c-logs" (OuterVolumeSpecName: "logs") pod "852f1c61-8d84-42fd-bed8-be55f65b3a4c" (UID: "852f1c61-8d84-42fd-bed8-be55f65b3a4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.214679 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852f1c61-8d84-42fd-bed8-be55f65b3a4c-kube-api-access-v9458" (OuterVolumeSpecName: "kube-api-access-v9458") pod "852f1c61-8d84-42fd-bed8-be55f65b3a4c" (UID: "852f1c61-8d84-42fd-bed8-be55f65b3a4c"). InnerVolumeSpecName "kube-api-access-v9458". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.255007 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "852f1c61-8d84-42fd-bed8-be55f65b3a4c" (UID: "852f1c61-8d84-42fd-bed8-be55f65b3a4c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.255742 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "852f1c61-8d84-42fd-bed8-be55f65b3a4c" (UID: "852f1c61-8d84-42fd-bed8-be55f65b3a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.296197 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-config-data" (OuterVolumeSpecName: "config-data") pod "852f1c61-8d84-42fd-bed8-be55f65b3a4c" (UID: "852f1c61-8d84-42fd-bed8-be55f65b3a4c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.306654 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.306696 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.306713 4725 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/852f1c61-8d84-42fd-bed8-be55f65b3a4c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.306725 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852f1c61-8d84-42fd-bed8-be55f65b3a4c-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.306736 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9458\" (UniqueName: \"kubernetes.io/projected/852f1c61-8d84-42fd-bed8-be55f65b3a4c-kube-api-access-v9458\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.639355 4725 generic.go:334] "Generic (PLEG): container finished" podID="617c62fd-dee8-4bab-a69a-8f348c8487a3" containerID="4f405d997ffaff8df8f72728e5663cade2aff7b899d3f9ed93a8cb96bbd91b7d" exitCode=0 Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.639432 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zb7cc" event={"ID":"617c62fd-dee8-4bab-a69a-8f348c8487a3","Type":"ContainerDied","Data":"4f405d997ffaff8df8f72728e5663cade2aff7b899d3f9ed93a8cb96bbd91b7d"} Feb 27 
06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.643264 4725 generic.go:334] "Generic (PLEG): container finished" podID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" exitCode=1 Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.643329 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerDied","Data":"768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e"} Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.643353 4725 scope.go:117] "RemoveContainer" containerID="e361e5e717fb7489267b59facc66846148c3a089296f8966f2c85e3d4c673f27" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.643939 4725 scope.go:117] "RemoveContainer" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" Feb 27 06:32:13 crc kubenswrapper[4725]: E0227 06:32:13.644156 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(98e9aa25-5670-466b-92a2-26b711b3ccf4)\"" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.647074 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"852f1c61-8d84-42fd-bed8-be55f65b3a4c","Type":"ContainerDied","Data":"f6a7b38351e84f511258efcd42c3df7ede9f73d18816212b096436cedfb57e81"} Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.647141 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.729316 4725 scope.go:117] "RemoveContainer" containerID="aec86448eabc75a84f0a93a84c8ad2c642a8946ca752fa9c488a9ec4aabdca28" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.759120 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.774701 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.786142 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:32:13 crc kubenswrapper[4725]: E0227 06:32:13.786697 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.786720 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api" Feb 27 06:32:13 crc kubenswrapper[4725]: E0227 06:32:13.786742 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api-log" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.786751 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api-log" Feb 27 06:32:13 crc kubenswrapper[4725]: E0227 06:32:13.786779 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5976615b-8ad2-4d95-9702-5b003064ee5c" containerName="oc" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.786788 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5976615b-8ad2-4d95-9702-5b003064ee5c" containerName="oc" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.787057 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api-log" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.787094 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5976615b-8ad2-4d95-9702-5b003064ee5c" containerName="oc" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.787110 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" containerName="watcher-api" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.788196 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.794731 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.794919 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.795826 4725 scope.go:117] "RemoveContainer" containerID="642bae2deabf590735398bfe29ed50f4cc91acbbeff8f43348da8d07230674db" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.798203 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.804552 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821141 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821273 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl2fg\" (UniqueName: \"kubernetes.io/projected/c3439aa5-3edd-49b2-8d83-5a34cd55764b-kube-api-access-dl2fg\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821503 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821674 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821725 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821759 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3439aa5-3edd-49b2-8d83-5a34cd55764b-logs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.821786 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-config-data\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.892681 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.896393 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.923569 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.923683 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.923712 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.923733 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3439aa5-3edd-49b2-8d83-5a34cd55764b-logs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 
06:32:13.923754 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-config-data\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.923801 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.923822 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl2fg\" (UniqueName: \"kubernetes.io/projected/c3439aa5-3edd-49b2-8d83-5a34cd55764b-kube-api-access-dl2fg\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.931057 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3439aa5-3edd-49b2-8d83-5a34cd55764b-logs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.932955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.933159 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-config-data\") pod \"watcher-api-0\" (UID: 
\"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.933399 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.933984 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.936124 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.946142 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:32:13 crc kubenswrapper[4725]: I0227 06:32:13.953998 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl2fg\" (UniqueName: \"kubernetes.io/projected/c3439aa5-3edd-49b2-8d83-5a34cd55764b-kube-api-access-dl2fg\") pod \"watcher-api-0\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " pod="openstack/watcher-api-0" Feb 27 06:32:14 crc kubenswrapper[4725]: I0227 06:32:14.128564 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:32:14 crc kubenswrapper[4725]: I0227 06:32:14.266513 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852f1c61-8d84-42fd-bed8-be55f65b3a4c" path="/var/lib/kubelet/pods/852f1c61-8d84-42fd-bed8-be55f65b3a4c/volumes" Feb 27 06:32:14 crc kubenswrapper[4725]: I0227 06:32:14.267411 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4efe9d9-892b-4097-80f3-4d95d83f52fe" path="/var/lib/kubelet/pods/e4efe9d9-892b-4097-80f3-4d95d83f52fe/volumes" Feb 27 06:32:14 crc kubenswrapper[4725]: I0227 06:32:14.660527 4725 scope.go:117] "RemoveContainer" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" Feb 27 06:32:14 crc kubenswrapper[4725]: E0227 06:32:14.660756 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(98e9aa25-5670-466b-92a2-26b711b3ccf4)\"" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" Feb 27 06:32:15 crc kubenswrapper[4725]: I0227 06:32:15.785311 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7f478fcd58-cfjzp" Feb 27 06:32:15 crc kubenswrapper[4725]: I0227 06:32:15.845267 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56865cdb4-9hs85"] Feb 27 06:32:15 crc kubenswrapper[4725]: I0227 06:32:15.845480 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon-log" containerID="cri-o://0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370" gracePeriod=30 Feb 27 06:32:15 crc kubenswrapper[4725]: I0227 06:32:15.845838 4725 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" containerID="cri-o://07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998" gracePeriod=30 Feb 27 06:32:15 crc kubenswrapper[4725]: I0227 06:32:15.852431 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Feb 27 06:32:16 crc kubenswrapper[4725]: I0227 06:32:16.593146 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:37760->10.217.0.168:8443: read: connection reset by peer" Feb 27 06:32:17 crc kubenswrapper[4725]: I0227 06:32:17.711101 4725 generic.go:334] "Generic (PLEG): container finished" podID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerID="07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998" exitCode=0 Feb 27 06:32:17 crc kubenswrapper[4725]: I0227 06:32:17.711280 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56865cdb4-9hs85" event={"ID":"a4fc5fc3-880a-46c5-a0a1-3248884d9882","Type":"ContainerDied","Data":"07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998"} Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.693471 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.738001 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-combined-ca-bundle\") pod \"617c62fd-dee8-4bab-a69a-8f348c8487a3\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.738098 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-db-sync-config-data\") pod \"617c62fd-dee8-4bab-a69a-8f348c8487a3\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.738241 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl8fj\" (UniqueName: \"kubernetes.io/projected/617c62fd-dee8-4bab-a69a-8f348c8487a3-kube-api-access-wl8fj\") pod \"617c62fd-dee8-4bab-a69a-8f348c8487a3\" (UID: \"617c62fd-dee8-4bab-a69a-8f348c8487a3\") " Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.752618 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617c62fd-dee8-4bab-a69a-8f348c8487a3-kube-api-access-wl8fj" (OuterVolumeSpecName: "kube-api-access-wl8fj") pod "617c62fd-dee8-4bab-a69a-8f348c8487a3" (UID: "617c62fd-dee8-4bab-a69a-8f348c8487a3"). InnerVolumeSpecName "kube-api-access-wl8fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.761619 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "617c62fd-dee8-4bab-a69a-8f348c8487a3" (UID: "617c62fd-dee8-4bab-a69a-8f348c8487a3"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.774694 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zb7cc" event={"ID":"617c62fd-dee8-4bab-a69a-8f348c8487a3","Type":"ContainerDied","Data":"e0441b3c16d6938635ab6818defdf1ee284cbbcf64407ed5e334948d02c0264b"} Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.774746 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0441b3c16d6938635ab6818defdf1ee284cbbcf64407ed5e334948d02c0264b" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.774818 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zb7cc" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.812857 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "617c62fd-dee8-4bab-a69a-8f348c8487a3" (UID: "617c62fd-dee8-4bab-a69a-8f348c8487a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.840678 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl8fj\" (UniqueName: \"kubernetes.io/projected/617c62fd-dee8-4bab-a69a-8f348c8487a3-kube-api-access-wl8fj\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.840704 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.840712 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/617c62fd-dee8-4bab-a69a-8f348c8487a3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.892553 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 27 06:32:18 crc kubenswrapper[4725]: I0227 06:32:18.918500 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 27 06:32:19 crc kubenswrapper[4725]: E0227 06:32:19.761476 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.784502 4725 generic.go:334] "Generic (PLEG): container finished" podID="6a306be2-547c-404f-afc1-4f4639cf7a28" containerID="84f332f8690913a03ca370eeaa881c97db72381090f14bef26d6fa16a82344ff" exitCode=0 Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.784570 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5v8pq" 
event={"ID":"6a306be2-547c-404f-afc1-4f4639cf7a28","Type":"ContainerDied","Data":"84f332f8690913a03ca370eeaa881c97db72381090f14bef26d6fa16a82344ff"} Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.788230 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerStarted","Data":"d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3"} Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.788825 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="ceilometer-notification-agent" containerID="cri-o://8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff" gracePeriod=30 Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.788849 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="proxy-httpd" containerID="cri-o://d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3" gracePeriod=30 Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.788857 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="sg-core" containerID="cri-o://bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1" gracePeriod=30 Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.825535 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 27 06:32:19 crc kubenswrapper[4725]: I0227 06:32:19.890833 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:32:19 crc kubenswrapper[4725]: W0227 06:32:19.920484 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3439aa5_3edd_49b2_8d83_5a34cd55764b.slice/crio-214e5b69dfd0a1b31b2603e521a20befd9ab7f0d0f617a81700e2785c2be79ed WatchSource:0}: Error finding container 214e5b69dfd0a1b31b2603e521a20befd9ab7f0d0f617a81700e2785c2be79ed: Status 404 returned error can't find the container with id 214e5b69dfd0a1b31b2603e521a20befd9ab7f0d0f617a81700e2785c2be79ed Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.016212 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64b777b644-7s9mh"] Feb 27 06:32:20 crc kubenswrapper[4725]: E0227 06:32:20.016780 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617c62fd-dee8-4bab-a69a-8f348c8487a3" containerName="barbican-db-sync" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.016857 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="617c62fd-dee8-4bab-a69a-8f348c8487a3" containerName="barbican-db-sync" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.017104 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="617c62fd-dee8-4bab-a69a-8f348c8487a3" containerName="barbican-db-sync" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.018137 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.036778 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.036953 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.037063 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-98jzd" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.040799 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-545fff646c-dqt5j"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.042379 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.049697 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.064487 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-545fff646c-dqt5j"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.099890 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64b777b644-7s9mh"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.146367 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57f4fb9dc-2h8p9"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.147764 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.155164 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f4fb9dc-2h8p9"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.166732 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgtt\" (UniqueName: \"kubernetes.io/projected/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-kube-api-access-hhgtt\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.166787 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-config-data\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.166950 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4edb4e2-0feb-4075-a823-c02d954872d3-logs\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167010 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-combined-ca-bundle\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167033 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-config-data\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167071 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68f2\" (UniqueName: \"kubernetes.io/projected/e4edb4e2-0feb-4075-a823-c02d954872d3-kube-api-access-s68f2\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167092 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-logs\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167137 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-config-data-custom\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167167 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-combined-ca-bundle\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: 
\"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.167188 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-config-data-custom\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.269851 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgtt\" (UniqueName: \"kubernetes.io/projected/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-kube-api-access-hhgtt\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.269911 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-config-data\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.269953 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-sb\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.269994 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-nb\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270021 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4edb4e2-0feb-4075-a823-c02d954872d3-logs\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270063 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-combined-ca-bundle\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270086 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-config-data\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270122 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68f2\" (UniqueName: \"kubernetes.io/projected/e4edb4e2-0feb-4075-a823-c02d954872d3-kube-api-access-s68f2\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270146 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-logs\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270180 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-config\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270210 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-swift-storage-0\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270240 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-config-data-custom\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270271 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nzm9\" (UniqueName: \"kubernetes.io/projected/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-kube-api-access-2nzm9\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270314 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-combined-ca-bundle\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270338 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-config-data-custom\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.270368 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-svc\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.275536 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-config-data\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.275859 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4edb4e2-0feb-4075-a823-c02d954872d3-logs\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.279468 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-combined-ca-bundle\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.286001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-logs\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.299671 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-config-data\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.300120 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-config-data-custom\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.311904 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgtt\" (UniqueName: \"kubernetes.io/projected/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-kube-api-access-hhgtt\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.313839 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s68f2\" (UniqueName: \"kubernetes.io/projected/e4edb4e2-0feb-4075-a823-c02d954872d3-kube-api-access-s68f2\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.320225 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8547558994-qzqlz"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.322836 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.324811 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.348581 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8547558994-qzqlz"] Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.349399 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d495df58-14fc-4eb9-a8f1-104b6ca6ce22-combined-ca-bundle\") pod \"barbican-keystone-listener-64b777b644-7s9mh\" (UID: \"d495df58-14fc-4eb9-a8f1-104b6ca6ce22\") " pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.350611 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4edb4e2-0feb-4075-a823-c02d954872d3-config-data-custom\") pod \"barbican-worker-545fff646c-dqt5j\" (UID: \"e4edb4e2-0feb-4075-a823-c02d954872d3\") " pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.372123 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-config\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.372178 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-swift-storage-0\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.372208 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nzm9\" (UniqueName: \"kubernetes.io/projected/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-kube-api-access-2nzm9\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.372242 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-svc\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.372349 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-sb\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.372388 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-nb\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.373522 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-config\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.373961 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-sb\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.374148 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-nb\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.374278 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-svc\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.374601 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-swift-storage-0\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: 
\"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.390919 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nzm9\" (UniqueName: \"kubernetes.io/projected/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-kube-api-access-2nzm9\") pod \"dnsmasq-dns-57f4fb9dc-2h8p9\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.412450 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.430420 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-545fff646c-dqt5j" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.474661 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw25r\" (UniqueName: \"kubernetes.io/projected/38a3a342-d88d-4c78-a3d0-19755678d48b-kube-api-access-tw25r\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.474774 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.474802 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38a3a342-d88d-4c78-a3d0-19755678d48b-logs\") pod \"barbican-api-8547558994-qzqlz\" (UID: 
\"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.474946 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-combined-ca-bundle\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.474987 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data-custom\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.485789 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.576985 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data-custom\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.577035 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw25r\" (UniqueName: \"kubernetes.io/projected/38a3a342-d88d-4c78-a3d0-19755678d48b-kube-api-access-tw25r\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.577096 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.577111 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38a3a342-d88d-4c78-a3d0-19755678d48b-logs\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.577202 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-combined-ca-bundle\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " 
pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.585720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38a3a342-d88d-4c78-a3d0-19755678d48b-logs\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.596457 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data-custom\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.597189 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-combined-ca-bundle\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.599354 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.604645 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw25r\" (UniqueName: \"kubernetes.io/projected/38a3a342-d88d-4c78-a3d0-19755678d48b-kube-api-access-tw25r\") pod \"barbican-api-8547558994-qzqlz\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: 
I0227 06:32:20.643887 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.806541 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c3439aa5-3edd-49b2-8d83-5a34cd55764b","Type":"ContainerStarted","Data":"e497ebf419188b2973657e53aadfb8650798b12612159ae4d0ebc4f07718df87"} Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.806581 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c3439aa5-3edd-49b2-8d83-5a34cd55764b","Type":"ContainerStarted","Data":"2ee597a40feb69270dfb0da558b4dcc29edce818ba7e3d1462ef27563587d7b1"} Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.806591 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c3439aa5-3edd-49b2-8d83-5a34cd55764b","Type":"ContainerStarted","Data":"214e5b69dfd0a1b31b2603e521a20befd9ab7f0d0f617a81700e2785c2be79ed"} Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.807344 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.814125 4725 generic.go:334] "Generic (PLEG): container finished" podID="f817188c-5563-4b93-abe7-94305a5c95a9" containerID="bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1" exitCode=2 Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.814208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerDied","Data":"bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1"} Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.836802 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=7.836781828 podStartE2EDuration="7.836781828s" 
podCreationTimestamp="2026-02-27 06:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:20.827817836 +0000 UTC m=+1319.290438415" watchObservedRunningTime="2026-02-27 06:32:20.836781828 +0000 UTC m=+1319.299402397" Feb 27 06:32:20 crc kubenswrapper[4725]: I0227 06:32:20.989377 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64b777b644-7s9mh"] Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.057584 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f4fb9dc-2h8p9"] Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.067370 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-545fff646c-dqt5j"] Feb 27 06:32:21 crc kubenswrapper[4725]: W0227 06:32:21.072381 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4edb4e2_0feb_4075_a823_c02d954872d3.slice/crio-74725f41d1b3c1ab09fadc8f0a02bca8c4ad12241d041e493e456e59932b80ec WatchSource:0}: Error finding container 74725f41d1b3c1ab09fadc8f0a02bca8c4ad12241d041e493e456e59932b80ec: Status 404 returned error can't find the container with id 74725f41d1b3c1ab09fadc8f0a02bca8c4ad12241d041e493e456e59932b80ec Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.230633 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8547558994-qzqlz"] Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.351863 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.395193 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-scripts\") pod \"6a306be2-547c-404f-afc1-4f4639cf7a28\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.395264 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-combined-ca-bundle\") pod \"6a306be2-547c-404f-afc1-4f4639cf7a28\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.395481 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmj4c\" (UniqueName: \"kubernetes.io/projected/6a306be2-547c-404f-afc1-4f4639cf7a28-kube-api-access-bmj4c\") pod \"6a306be2-547c-404f-afc1-4f4639cf7a28\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.395553 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-db-sync-config-data\") pod \"6a306be2-547c-404f-afc1-4f4639cf7a28\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.395653 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-config-data\") pod \"6a306be2-547c-404f-afc1-4f4639cf7a28\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.395689 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/6a306be2-547c-404f-afc1-4f4639cf7a28-etc-machine-id\") pod \"6a306be2-547c-404f-afc1-4f4639cf7a28\" (UID: \"6a306be2-547c-404f-afc1-4f4639cf7a28\") " Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.397372 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a306be2-547c-404f-afc1-4f4639cf7a28-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6a306be2-547c-404f-afc1-4f4639cf7a28" (UID: "6a306be2-547c-404f-afc1-4f4639cf7a28"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.406457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a306be2-547c-404f-afc1-4f4639cf7a28-kube-api-access-bmj4c" (OuterVolumeSpecName: "kube-api-access-bmj4c") pod "6a306be2-547c-404f-afc1-4f4639cf7a28" (UID: "6a306be2-547c-404f-afc1-4f4639cf7a28"). InnerVolumeSpecName "kube-api-access-bmj4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.413370 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-scripts" (OuterVolumeSpecName: "scripts") pod "6a306be2-547c-404f-afc1-4f4639cf7a28" (UID: "6a306be2-547c-404f-afc1-4f4639cf7a28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.416309 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6a306be2-547c-404f-afc1-4f4639cf7a28" (UID: "6a306be2-547c-404f-afc1-4f4639cf7a28"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.455641 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a306be2-547c-404f-afc1-4f4639cf7a28" (UID: "6a306be2-547c-404f-afc1-4f4639cf7a28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.480836 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-config-data" (OuterVolumeSpecName: "config-data") pod "6a306be2-547c-404f-afc1-4f4639cf7a28" (UID: "6a306be2-547c-404f-afc1-4f4639cf7a28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.498277 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.498332 4725 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a306be2-547c-404f-afc1-4f4639cf7a28-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.498346 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.498358 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:21 crc 
kubenswrapper[4725]: I0227 06:32:21.498371 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmj4c\" (UniqueName: \"kubernetes.io/projected/6a306be2-547c-404f-afc1-4f4639cf7a28-kube-api-access-bmj4c\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.498383 4725 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a306be2-547c-404f-afc1-4f4639cf7a28-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.834519 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8547558994-qzqlz" event={"ID":"38a3a342-d88d-4c78-a3d0-19755678d48b","Type":"ContainerStarted","Data":"db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.834817 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8547558994-qzqlz" event={"ID":"38a3a342-d88d-4c78-a3d0-19755678d48b","Type":"ContainerStarted","Data":"f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.834827 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8547558994-qzqlz" event={"ID":"38a3a342-d88d-4c78-a3d0-19755678d48b","Type":"ContainerStarted","Data":"6d5e1b3f853a9dfe1ba1484108665dc97f315001c3b0956a28fdfd19e4153597"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.835972 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.835994 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.837871 4725 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.838961 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5v8pq" event={"ID":"6a306be2-547c-404f-afc1-4f4639cf7a28","Type":"ContainerDied","Data":"4d6cb75cb078aa6e8e144ac051b379e31055dfa2e4c2cfffe56d4f4962e34ac3"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.838985 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6cb75cb078aa6e8e144ac051b379e31055dfa2e4c2cfffe56d4f4962e34ac3" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.839040 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-5v8pq" Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.843098 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-545fff646c-dqt5j" event={"ID":"e4edb4e2-0feb-4075-a823-c02d954872d3","Type":"ContainerStarted","Data":"74725f41d1b3c1ab09fadc8f0a02bca8c4ad12241d041e493e456e59932b80ec"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.848134 4725 generic.go:334] "Generic (PLEG): container finished" podID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerID="3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa" exitCode=0 Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.848190 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" event={"ID":"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b","Type":"ContainerDied","Data":"3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.848214 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" event={"ID":"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b","Type":"ContainerStarted","Data":"bdb96f8e7e8f8548ddd1e06d2b8b1e07123ce685bf7247a2c9509d01cbddfe1b"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.855014 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" event={"ID":"d495df58-14fc-4eb9-a8f1-104b6ca6ce22","Type":"ContainerStarted","Data":"b92bf3024e1e0c0b16fe97289d54ec1f9aa4a72937f87b6588ea9613ec096bc3"} Feb 27 06:32:21 crc kubenswrapper[4725]: I0227 06:32:21.870775 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8547558994-qzqlz" podStartSLOduration=1.8707604880000002 podStartE2EDuration="1.870760488s" podCreationTimestamp="2026-02-27 06:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:21.861792536 +0000 UTC m=+1320.324413115" watchObservedRunningTime="2026-02-27 06:32:21.870760488 +0000 UTC m=+1320.333381057" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.089436 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:22 crc kubenswrapper[4725]: E0227 06:32:22.089784 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a306be2-547c-404f-afc1-4f4639cf7a28" containerName="cinder-db-sync" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.089799 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a306be2-547c-404f-afc1-4f4639cf7a28" containerName="cinder-db-sync" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.089991 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a306be2-547c-404f-afc1-4f4639cf7a28" containerName="cinder-db-sync" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.090880 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.094413 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5kxt8" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.094612 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.100799 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.100831 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.133893 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.217346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xd8n\" (UniqueName: \"kubernetes.io/projected/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-kube-api-access-5xd8n\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.217401 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-scripts\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.217442 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.217497 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.217579 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.217595 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.320120 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.320184 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc 
kubenswrapper[4725]: I0227 06:32:22.320228 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xd8n\" (UniqueName: \"kubernetes.io/projected/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-kube-api-access-5xd8n\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.320266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-scripts\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.320319 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.320374 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.322360 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.331338 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 06:32:22 crc 
kubenswrapper[4725]: I0227 06:32:22.331600 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.331763 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.365242 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f4fb9dc-2h8p9"] Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.374604 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.376390 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.377105 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-scripts\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.382836 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xd8n\" (UniqueName: \"kubernetes.io/projected/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-kube-api-access-5xd8n\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.429235 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data\") pod \"cinder-scheduler-0\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.479180 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-674d64fdcf-wfmcl"] Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.499298 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.499334 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.499582 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.520249 4725 scope.go:117] "RemoveContainer" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" Feb 27 06:32:22 crc kubenswrapper[4725]: E0227 06:32:22.520563 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(98e9aa25-5670-466b-92a2-26b711b3ccf4)\"" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.623042 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-674d64fdcf-wfmcl"] Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.629908 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-config\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.629988 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-sb\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.630066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-swift-storage-0\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.630093 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-nb\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.630159 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc9qs\" (UniqueName: \"kubernetes.io/projected/29494296-f5e8-4f29-8123-83f487cace05-kube-api-access-dc9qs\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.630205 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-svc\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.681527 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.684918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.690672 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.705091 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.713192 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5kxt8" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.722726 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732091 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-svc\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732159 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732192 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6906bd-905c-49d9-92d4-3f59b948ad2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732213 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sccbd\" (UniqueName: \"kubernetes.io/projected/3e6906bd-905c-49d9-92d4-3f59b948ad2a-kube-api-access-sccbd\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732239 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 
06:32:22.732345 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-config\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732382 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-sb\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732436 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-swift-storage-0\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732485 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-nb\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732529 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732568 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-scripts\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732592 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc9qs\" (UniqueName: \"kubernetes.io/projected/29494296-f5e8-4f29-8123-83f487cace05-kube-api-access-dc9qs\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.732617 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6906bd-905c-49d9-92d4-3f59b948ad2a-logs\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.733539 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-svc\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.733587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-config\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.733686 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.734497 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-swift-storage-0\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.734822 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-nb\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.735874 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-sb\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.748943 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc9qs\" (UniqueName: \"kubernetes.io/projected/29494296-f5e8-4f29-8123-83f487cace05-kube-api-access-dc9qs\") pod \"dnsmasq-dns-674d64fdcf-wfmcl\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834454 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sccbd\" (UniqueName: \"kubernetes.io/projected/3e6906bd-905c-49d9-92d4-3f59b948ad2a-kube-api-access-sccbd\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834513 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834680 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834736 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-scripts\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6906bd-905c-49d9-92d4-3f59b948ad2a-logs\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834803 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.834827 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6906bd-905c-49d9-92d4-3f59b948ad2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc 
kubenswrapper[4725]: I0227 06:32:22.834916 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6906bd-905c-49d9-92d4-3f59b948ad2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.836113 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6906bd-905c-49d9-92d4-3f59b948ad2a-logs\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.839321 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-scripts\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.839429 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.839954 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.846505 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.853739 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sccbd\" (UniqueName: \"kubernetes.io/projected/3e6906bd-905c-49d9-92d4-3f59b948ad2a-kube-api-access-sccbd\") pod \"cinder-api-0\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " pod="openstack/cinder-api-0" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.894107 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:22 crc kubenswrapper[4725]: I0227 06:32:22.999836 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.129081 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.129184 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.129200 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.726896 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-674d64fdcf-wfmcl"] Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.857425 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.870144 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.892179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-545fff646c-dqt5j" 
event={"ID":"e4edb4e2-0feb-4075-a823-c02d954872d3","Type":"ContainerStarted","Data":"8dbfb8eb77c7a49673c893b949b2f0290a883296cb046c4124984bf82c749ca4"} Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.894961 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" event={"ID":"29494296-f5e8-4f29-8123-83f487cace05","Type":"ContainerStarted","Data":"585d05a9a25e5d36fc17351a0803ed32fa3d5944f388723cb27e87560c3a03d7"} Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.920146 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" event={"ID":"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b","Type":"ContainerStarted","Data":"91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120"} Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.920492 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerName="dnsmasq-dns" containerID="cri-o://91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120" gracePeriod=10 Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.920757 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.938736 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" event={"ID":"d495df58-14fc-4eb9-a8f1-104b6ca6ce22","Type":"ContainerStarted","Data":"d627dadcdff74d303a1f46e1d671bbb0060a4b6ab3a48747805d927dce1f5aa3"} Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.951475 4725 generic.go:334] "Generic (PLEG): container finished" podID="f817188c-5563-4b93-abe7-94305a5c95a9" containerID="8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff" exitCode=0 Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.951809 4725 prober_manager.go:312] "Failed 
to trigger a manual run" probe="Readiness" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.952526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerDied","Data":"8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff"} Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.967926 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" podStartSLOduration=4.967902872 podStartE2EDuration="4.967902872s" podCreationTimestamp="2026-02-27 06:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:24.950251345 +0000 UTC m=+1323.412871924" watchObservedRunningTime="2026-02-27 06:32:24.967902872 +0000 UTC m=+1323.430523441" Feb 27 06:32:24 crc kubenswrapper[4725]: I0227 06:32:24.974394 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.133437 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.180:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.827419 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.975117 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" event={"ID":"d495df58-14fc-4eb9-a8f1-104b6ca6ce22","Type":"ContainerStarted","Data":"acd1ef59c917f7c45a0dff1d02c24d200da36d9f686d4327f857ebe181ca10fa"} Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.986407 4725 generic.go:334] "Generic (PLEG): container finished" podID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerID="bde0c1fffb3692c65b3046d03b17cbac44a4d9aaeb7e486a9095e3acbf9352bf" exitCode=137 Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.989675 4725 generic.go:334] "Generic (PLEG): container finished" podID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerID="b1aa00d211f4acd6b408c48a72dc95d48f813b8e5132ac819a91abc1e5a48f5b" exitCode=137 Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.989828 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc5ddffd5-r9bpn" event={"ID":"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd","Type":"ContainerDied","Data":"bde0c1fffb3692c65b3046d03b17cbac44a4d9aaeb7e486a9095e3acbf9352bf"} Feb 27 06:32:25 crc kubenswrapper[4725]: I0227 06:32:25.989930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc5ddffd5-r9bpn" event={"ID":"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd","Type":"ContainerDied","Data":"b1aa00d211f4acd6b408c48a72dc95d48f813b8e5132ac819a91abc1e5a48f5b"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.000874 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-sb\") pod \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.000980 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-swift-storage-0\") pod \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.001006 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-config\") pod \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.001040 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-nb\") pod \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.001123 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nzm9\" (UniqueName: \"kubernetes.io/projected/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-kube-api-access-2nzm9\") pod \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.001207 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-svc\") pod \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\" (UID: \"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.018479 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-kube-api-access-2nzm9" (OuterVolumeSpecName: "kube-api-access-2nzm9") pod "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" (UID: 
"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b"). InnerVolumeSpecName "kube-api-access-2nzm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.029296 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64b777b644-7s9mh" podStartSLOduration=3.650887779 podStartE2EDuration="7.029263482s" podCreationTimestamp="2026-02-27 06:32:19 +0000 UTC" firstStartedPulling="2026-02-27 06:32:20.998416294 +0000 UTC m=+1319.461036863" lastFinishedPulling="2026-02-27 06:32:24.376791997 +0000 UTC m=+1322.839412566" observedRunningTime="2026-02-27 06:32:26.008256981 +0000 UTC m=+1324.470877560" watchObservedRunningTime="2026-02-27 06:32:26.029263482 +0000 UTC m=+1324.491884051" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.038603 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6906bd-905c-49d9-92d4-3f59b948ad2a","Type":"ContainerStarted","Data":"ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.038647 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6906bd-905c-49d9-92d4-3f59b948ad2a","Type":"ContainerStarted","Data":"1d30bde061641191ecaf9f217ebb6e0259dd12e5c562ef082fd117bbc86e7756"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.040172 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aae62a22-d2ec-4af6-9e82-b7a3aafb9188","Type":"ContainerStarted","Data":"9beb7290dd3b0aad9b0cf014519a3bf3173d0599bd81577bf8a3dc35f36b6cb5"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.048101 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-545fff646c-dqt5j" 
event={"ID":"e4edb4e2-0feb-4075-a823-c02d954872d3","Type":"ContainerStarted","Data":"947d51458f2bd5a9ad27d3be7336181c7c12585637ed6adb75da66d31d6a8009"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.055445 4725 generic.go:334] "Generic (PLEG): container finished" podID="29494296-f5e8-4f29-8123-83f487cace05" containerID="9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a" exitCode=0 Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.055550 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" event={"ID":"29494296-f5e8-4f29-8123-83f487cace05","Type":"ContainerDied","Data":"9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.070689 4725 generic.go:334] "Generic (PLEG): container finished" podID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerID="91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120" exitCode=0 Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.071564 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.071913 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" event={"ID":"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b","Type":"ContainerDied","Data":"91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.071943 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f4fb9dc-2h8p9" event={"ID":"af2f6a7b-ad66-4ad0-983d-9fe66a70e20b","Type":"ContainerDied","Data":"bdb96f8e7e8f8548ddd1e06d2b8b1e07123ce685bf7247a2c9509d01cbddfe1b"} Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.071959 4725 scope.go:117] "RemoveContainer" containerID="91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.092804 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-config" (OuterVolumeSpecName: "config") pod "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" (UID: "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.093566 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" (UID: "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.097755 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" (UID: "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.099991 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-545fff646c-dqt5j" podStartSLOduration=3.812869792 podStartE2EDuration="7.09997476s" podCreationTimestamp="2026-02-27 06:32:19 +0000 UTC" firstStartedPulling="2026-02-27 06:32:21.075522002 +0000 UTC m=+1319.538142581" lastFinishedPulling="2026-02-27 06:32:24.36262698 +0000 UTC m=+1322.825247549" observedRunningTime="2026-02-27 06:32:26.065143811 +0000 UTC m=+1324.527764380" watchObservedRunningTime="2026-02-27 06:32:26.09997476 +0000 UTC m=+1324.562595319" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.103038 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nzm9\" (UniqueName: \"kubernetes.io/projected/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-kube-api-access-2nzm9\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.103059 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.103071 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: 
I0227 06:32:26.103080 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.124208 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" (UID: "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.128099 4725 scope.go:117] "RemoveContainer" containerID="3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.135740 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" (UID: "af2f6a7b-ad66-4ad0-983d-9fe66a70e20b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.138563 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.178457 4725 scope.go:117] "RemoveContainer" containerID="91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120" Feb 27 06:32:26 crc kubenswrapper[4725]: E0227 06:32:26.179024 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120\": container with ID starting with 91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120 not found: ID does not exist" containerID="91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.179137 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120"} err="failed to get container status \"91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120\": rpc error: code = NotFound desc = could not find container \"91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120\": container with ID starting with 91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120 not found: ID does not exist" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.179226 4725 scope.go:117] "RemoveContainer" containerID="3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa" Feb 27 06:32:26 crc kubenswrapper[4725]: E0227 06:32:26.179804 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa\": container with ID starting with 3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa not found: ID does not exist" containerID="3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 
06:32:26.179891 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa"} err="failed to get container status \"3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa\": rpc error: code = NotFound desc = could not find container \"3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa\": container with ID starting with 3f7a62325176d381ea1eceda99c5a29493550ca0b612a32cd9aad28a64b72ffa not found: ID does not exist" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.204562 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.204587 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.206881 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.305814 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-logs\") pod \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.306069 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-horizon-secret-key\") pod \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 
06:32:26.306145 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q25zn\" (UniqueName: \"kubernetes.io/projected/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-kube-api-access-q25zn\") pod \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.306235 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-config-data\") pod \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.306270 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-scripts\") pod \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\" (UID: \"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd\") " Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.306270 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-logs" (OuterVolumeSpecName: "logs") pod "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" (UID: "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.306729 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.310527 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" (UID: "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.310894 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-kube-api-access-q25zn" (OuterVolumeSpecName: "kube-api-access-q25zn") pod "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" (UID: "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd"). InnerVolumeSpecName "kube-api-access-q25zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.343978 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-config-data" (OuterVolumeSpecName: "config-data") pod "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" (UID: "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.363876 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-scripts" (OuterVolumeSpecName: "scripts") pod "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" (UID: "ba49d2a8-4ee2-4fe4-87dd-d875279c77cd"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.411907 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.411942 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q25zn\" (UniqueName: \"kubernetes.io/projected/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-kube-api-access-q25zn\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.411953 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.411961 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.507485 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f4fb9dc-2h8p9"] Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.515239 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57f4fb9dc-2h8p9"] Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.588354 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b44778b65-pk2td"] Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.588637 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b44778b65-pk2td" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-api" containerID="cri-o://0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d" gracePeriod=30 
Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.589092 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b44778b65-pk2td" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-httpd" containerID="cri-o://2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f" gracePeriod=30 Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.600833 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b86d8c849-9kc54"] Feb 27 06:32:26 crc kubenswrapper[4725]: E0227 06:32:26.601253 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerName="dnsmasq-dns" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601269 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerName="dnsmasq-dns" Feb 27 06:32:26 crc kubenswrapper[4725]: E0227 06:32:26.601306 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon-log" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601313 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon-log" Feb 27 06:32:26 crc kubenswrapper[4725]: E0227 06:32:26.601330 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerName="init" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601337 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerName="init" Feb 27 06:32:26 crc kubenswrapper[4725]: E0227 06:32:26.601348 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601354 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601541 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601559 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" containerName="dnsmasq-dns" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.601580 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" containerName="horizon-log" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.602546 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.608868 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.615795 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b86d8c849-9kc54"] Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.713736 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6b44778b65-pk2td" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9696/\": read tcp 10.217.0.2:38520->10.217.0.175:9696: read: connection reset by peer" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.719818 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-combined-ca-bundle\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.719959 
4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxv8h\" (UniqueName: \"kubernetes.io/projected/bdb517d6-290d-43f7-9791-297c8dace84e-kube-api-access-cxv8h\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.720054 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-ovndb-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.720113 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-config\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.720306 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-internal-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.720371 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-httpd-config\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.720466 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-public-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxv8h\" (UniqueName: \"kubernetes.io/projected/bdb517d6-290d-43f7-9791-297c8dace84e-kube-api-access-cxv8h\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822748 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-ovndb-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822774 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-config\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822808 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-internal-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822826 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-httpd-config\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822861 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-public-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.822883 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-combined-ca-bundle\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.827933 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-ovndb-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.831654 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-public-tls-certs\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.832153 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-internal-tls-certs\") 
pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.834440 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-combined-ca-bundle\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.840904 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-httpd-config\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.841617 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bdb517d6-290d-43f7-9791-297c8dace84e-config\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.848060 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxv8h\" (UniqueName: \"kubernetes.io/projected/bdb517d6-290d-43f7-9791-297c8dace84e-kube-api-access-cxv8h\") pod \"neutron-5b86d8c849-9kc54\" (UID: \"bdb517d6-290d-43f7-9791-297c8dace84e\") " pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:26 crc kubenswrapper[4725]: I0227 06:32:26.970126 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.100062 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" event={"ID":"29494296-f5e8-4f29-8123-83f487cace05","Type":"ContainerStarted","Data":"378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e"} Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.100190 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.112934 4725 generic.go:334] "Generic (PLEG): container finished" podID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerID="2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f" exitCode=0 Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.112978 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b44778b65-pk2td" event={"ID":"4598bdbc-f18b-4709-baa9-013d097a4dfc","Type":"ContainerDied","Data":"2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f"} Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.114996 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" podStartSLOduration=5.114983747 podStartE2EDuration="5.114983747s" podCreationTimestamp="2026-02-27 06:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:27.114315778 +0000 UTC m=+1325.576936347" watchObservedRunningTime="2026-02-27 06:32:27.114983747 +0000 UTC m=+1325.577604316" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.123045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc5ddffd5-r9bpn" event={"ID":"ba49d2a8-4ee2-4fe4-87dd-d875279c77cd","Type":"ContainerDied","Data":"737101e1fa91aa437b70cc13dd0baba7cecea4a1fb214a1c6b71527ed42d010f"} 
Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.123811 4725 scope.go:117] "RemoveContainer" containerID="bde0c1fffb3692c65b3046d03b17cbac44a4d9aaeb7e486a9095e3acbf9352bf" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.123353 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc5ddffd5-r9bpn" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.131916 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6906bd-905c-49d9-92d4-3f59b948ad2a","Type":"ContainerStarted","Data":"a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19"} Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.132063 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api-log" containerID="cri-o://ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756" gracePeriod=30 Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.132336 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.132476 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api" containerID="cri-o://a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19" gracePeriod=30 Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.140843 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aae62a22-d2ec-4af6-9e82-b7a3aafb9188","Type":"ContainerStarted","Data":"7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb"} Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.209863 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=5.209843385 podStartE2EDuration="5.209843385s" podCreationTimestamp="2026-02-27 06:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:27.169190691 +0000 UTC m=+1325.631811270" watchObservedRunningTime="2026-02-27 06:32:27.209843385 +0000 UTC m=+1325.672463954" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.228796 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc5ddffd5-r9bpn"] Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.263318 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cc5ddffd5-r9bpn"] Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.358683 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7959dbc8c4-8fc74"] Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.360382 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.365261 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.365869 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.366398 4725 scope.go:117] "RemoveContainer" containerID="b1aa00d211f4acd6b408c48a72dc95d48f813b8e5132ac819a91abc1e5a48f5b" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.372851 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7959dbc8c4-8fc74"] Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.442573 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-config-data-custom\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.442836 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-internal-tls-certs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.442866 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-public-tls-certs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.442976 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2cds\" (UniqueName: \"kubernetes.io/projected/db039076-6d42-4d4e-b0d2-479ae5a91408-kube-api-access-b2cds\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.443015 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-combined-ca-bundle\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.443039 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db039076-6d42-4d4e-b0d2-479ae5a91408-logs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.443072 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-config-data\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2cds\" (UniqueName: \"kubernetes.io/projected/db039076-6d42-4d4e-b0d2-479ae5a91408-kube-api-access-b2cds\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544826 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-combined-ca-bundle\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544855 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db039076-6d42-4d4e-b0d2-479ae5a91408-logs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544888 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-config-data\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544910 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-config-data-custom\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544937 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-internal-tls-certs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.544963 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-public-tls-certs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.545832 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db039076-6d42-4d4e-b0d2-479ae5a91408-logs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.552696 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-internal-tls-certs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.552892 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-combined-ca-bundle\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.553238 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-public-tls-certs\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.553816 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-config-data-custom\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.554736 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db039076-6d42-4d4e-b0d2-479ae5a91408-config-data\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.564720 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2cds\" (UniqueName: 
\"kubernetes.io/projected/db039076-6d42-4d4e-b0d2-479ae5a91408-kube-api-access-b2cds\") pod \"barbican-api-7959dbc8c4-8fc74\" (UID: \"db039076-6d42-4d4e-b0d2-479ae5a91408\") " pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.724322 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:27 crc kubenswrapper[4725]: I0227 06:32:27.841260 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b86d8c849-9kc54"] Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.153477 4725 generic.go:334] "Generic (PLEG): container finished" podID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerID="ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756" exitCode=143 Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.153552 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6906bd-905c-49d9-92d4-3f59b948ad2a","Type":"ContainerDied","Data":"ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756"} Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.154551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b86d8c849-9kc54" event={"ID":"bdb517d6-290d-43f7-9791-297c8dace84e","Type":"ContainerStarted","Data":"c167fb839248ffbf407186925d0b8dca4ed3fef0d1c0e19274d052e6d600d344"} Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.166880 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aae62a22-d2ec-4af6-9e82-b7a3aafb9188","Type":"ContainerStarted","Data":"b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1"} Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.202330 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.890437275 podStartE2EDuration="6.202310386s" 
podCreationTimestamp="2026-02-27 06:32:22 +0000 UTC" firstStartedPulling="2026-02-27 06:32:24.902139372 +0000 UTC m=+1323.364759931" lastFinishedPulling="2026-02-27 06:32:25.214012473 +0000 UTC m=+1323.676633042" observedRunningTime="2026-02-27 06:32:28.188488207 +0000 UTC m=+1326.651108776" watchObservedRunningTime="2026-02-27 06:32:28.202310386 +0000 UTC m=+1326.664930955" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.215714 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7959dbc8c4-8fc74"] Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.270110 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2f6a7b-ad66-4ad0-983d-9fe66a70e20b" path="/var/lib/kubelet/pods/af2f6a7b-ad66-4ad0-983d-9fe66a70e20b/volumes" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.270792 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba49d2a8-4ee2-4fe4-87dd-d875279c77cd" path="/var/lib/kubelet/pods/ba49d2a8-4ee2-4fe4-87dd-d875279c77cd/volumes" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.766210 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b44778b65-pk2td" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.877967 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-public-tls-certs\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.878016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-config\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.878053 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-httpd-config\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.878109 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-combined-ca-bundle\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.878148 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdf2r\" (UniqueName: \"kubernetes.io/projected/4598bdbc-f18b-4709-baa9-013d097a4dfc-kube-api-access-cdf2r\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.878280 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-internal-tls-certs\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.878333 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-ovndb-tls-certs\") pod \"4598bdbc-f18b-4709-baa9-013d097a4dfc\" (UID: \"4598bdbc-f18b-4709-baa9-013d097a4dfc\") " Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.892658 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4598bdbc-f18b-4709-baa9-013d097a4dfc-kube-api-access-cdf2r" (OuterVolumeSpecName: "kube-api-access-cdf2r") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "kube-api-access-cdf2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.900004 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.980832 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.980863 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdf2r\" (UniqueName: \"kubernetes.io/projected/4598bdbc-f18b-4709-baa9-013d097a4dfc-kube-api-access-cdf2r\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:28 crc kubenswrapper[4725]: I0227 06:32:28.989504 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.024824 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.046572 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-config" (OuterVolumeSpecName: "config") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.047397 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.082462 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.082504 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.082514 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.082525 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.127045 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4598bdbc-f18b-4709-baa9-013d097a4dfc" (UID: "4598bdbc-f18b-4709-baa9-013d097a4dfc"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.184188 4725 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4598bdbc-f18b-4709-baa9-013d097a4dfc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.186693 4725 generic.go:334] "Generic (PLEG): container finished" podID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerID="0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d" exitCode=0 Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.186753 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b44778b65-pk2td" event={"ID":"4598bdbc-f18b-4709-baa9-013d097a4dfc","Type":"ContainerDied","Data":"0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.186778 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b44778b65-pk2td" event={"ID":"4598bdbc-f18b-4709-baa9-013d097a4dfc","Type":"ContainerDied","Data":"10b80694baa957c82c70925cefebc2813af3180201ce0bc4a33b15ffaeff96d4"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.186793 4725 scope.go:117] "RemoveContainer" containerID="2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.186926 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b44778b65-pk2td" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.211643 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b86d8c849-9kc54" event={"ID":"bdb517d6-290d-43f7-9791-297c8dace84e","Type":"ContainerStarted","Data":"c2007809dfc4add4bd064013667f38bcff46c6781a4897ab8a09d61575517c63"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.211696 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b86d8c849-9kc54" event={"ID":"bdb517d6-290d-43f7-9791-297c8dace84e","Type":"ContainerStarted","Data":"dba0b45bce2226857e1c89c9e3488e3e74613792d567a8fa449b2eda77f0a230"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.212920 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.223622 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7959dbc8c4-8fc74" event={"ID":"db039076-6d42-4d4e-b0d2-479ae5a91408","Type":"ContainerStarted","Data":"f0118377df4faf1e801010204d0d95b006f653954b332b0e5fdcecd24b7bae51"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.223668 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7959dbc8c4-8fc74" event={"ID":"db039076-6d42-4d4e-b0d2-479ae5a91408","Type":"ContainerStarted","Data":"0ebc07483d5912ba82e5c964b2aa80105f4f33980f358e21c40f31df1bff3d58"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.223680 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7959dbc8c4-8fc74" event={"ID":"db039076-6d42-4d4e-b0d2-479ae5a91408","Type":"ContainerStarted","Data":"09e6c600b675fba1f2fc8b09c65048e4036e22cf06ebd62515a34b21766eec47"} Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.223730 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 
06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.223905 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.247622 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b86d8c849-9kc54" podStartSLOduration=3.247597194 podStartE2EDuration="3.247597194s" podCreationTimestamp="2026-02-27 06:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:29.236260785 +0000 UTC m=+1327.698881354" watchObservedRunningTime="2026-02-27 06:32:29.247597194 +0000 UTC m=+1327.710217763" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.249193 4725 scope.go:117] "RemoveContainer" containerID="0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.273776 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b44778b65-pk2td"] Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.287331 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6b44778b65-pk2td"] Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.305229 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7959dbc8c4-8fc74" podStartSLOduration=2.305206734 podStartE2EDuration="2.305206734s" podCreationTimestamp="2026-02-27 06:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:29.278858573 +0000 UTC m=+1327.741479142" watchObservedRunningTime="2026-02-27 06:32:29.305206734 +0000 UTC m=+1327.767827303" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.329267 4725 scope.go:117] "RemoveContainer" containerID="2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f" Feb 27 
06:32:29 crc kubenswrapper[4725]: E0227 06:32:29.329838 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f\": container with ID starting with 2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f not found: ID does not exist" containerID="2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.329937 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f"} err="failed to get container status \"2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f\": rpc error: code = NotFound desc = could not find container \"2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f\": container with ID starting with 2a03942c66babcf8f661c24abe35c946a4542c59f42cca03d3876d14e009221f not found: ID does not exist" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.330011 4725 scope.go:117] "RemoveContainer" containerID="0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d" Feb 27 06:32:29 crc kubenswrapper[4725]: E0227 06:32:29.330548 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d\": container with ID starting with 0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d not found: ID does not exist" containerID="0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d" Feb 27 06:32:29 crc kubenswrapper[4725]: I0227 06:32:29.330598 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d"} err="failed to get container status 
\"0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d\": rpc error: code = NotFound desc = could not find container \"0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d\": container with ID starting with 0c50d1e4146e9b4fd9eeef03579e84f3f93307af5088f29b7b835ef198445a2d not found: ID does not exist" Feb 27 06:32:30 crc kubenswrapper[4725]: I0227 06:32:30.263854 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" path="/var/lib/kubelet/pods/4598bdbc-f18b-4709-baa9-013d097a4dfc/volumes" Feb 27 06:32:31 crc kubenswrapper[4725]: I0227 06:32:31.837480 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Feb 27 06:32:31 crc kubenswrapper[4725]: I0227 06:32:31.881751 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.100708 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.573975 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.574038 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.723556 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.896462 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.941686 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.994453 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bbccbc7cf-sts7s"] Feb 27 06:32:32 crc kubenswrapper[4725]: I0227 06:32:32.994952 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerName="dnsmasq-dns" containerID="cri-o://a2015d783012eb40f3a5917c9c739000fc14f1a0864079acfb13cd89bf9b3a4a" gracePeriod=10 Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.272442 4725 generic.go:334] "Generic (PLEG): container finished" podID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerID="a2015d783012eb40f3a5917c9c739000fc14f1a0864079acfb13cd89bf9b3a4a" exitCode=0 Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.272496 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" event={"ID":"480f17b9-c37d-4cc1-a611-9500bae66f11","Type":"ContainerDied","Data":"a2015d783012eb40f3a5917c9c739000fc14f1a0864079acfb13cd89bf9b3a4a"} Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.320159 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.507050 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.598250 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-svc\") pod \"480f17b9-c37d-4cc1-a611-9500bae66f11\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.598345 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww7r7\" (UniqueName: \"kubernetes.io/projected/480f17b9-c37d-4cc1-a611-9500bae66f11-kube-api-access-ww7r7\") pod \"480f17b9-c37d-4cc1-a611-9500bae66f11\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.598403 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-nb\") pod \"480f17b9-c37d-4cc1-a611-9500bae66f11\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.598444 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-sb\") pod \"480f17b9-c37d-4cc1-a611-9500bae66f11\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.598527 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-swift-storage-0\") pod \"480f17b9-c37d-4cc1-a611-9500bae66f11\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.598587 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-config\") pod \"480f17b9-c37d-4cc1-a611-9500bae66f11\" (UID: \"480f17b9-c37d-4cc1-a611-9500bae66f11\") " Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.626584 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480f17b9-c37d-4cc1-a611-9500bae66f11-kube-api-access-ww7r7" (OuterVolumeSpecName: "kube-api-access-ww7r7") pod "480f17b9-c37d-4cc1-a611-9500bae66f11" (UID: "480f17b9-c37d-4cc1-a611-9500bae66f11"). InnerVolumeSpecName "kube-api-access-ww7r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.700657 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww7r7\" (UniqueName: \"kubernetes.io/projected/480f17b9-c37d-4cc1-a611-9500bae66f11-kube-api-access-ww7r7\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.736033 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "480f17b9-c37d-4cc1-a611-9500bae66f11" (UID: "480f17b9-c37d-4cc1-a611-9500bae66f11"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.736725 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "480f17b9-c37d-4cc1-a611-9500bae66f11" (UID: "480f17b9-c37d-4cc1-a611-9500bae66f11"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.739640 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "480f17b9-c37d-4cc1-a611-9500bae66f11" (UID: "480f17b9-c37d-4cc1-a611-9500bae66f11"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.751687 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "480f17b9-c37d-4cc1-a611-9500bae66f11" (UID: "480f17b9-c37d-4cc1-a611-9500bae66f11"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.788869 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-config" (OuterVolumeSpecName: "config") pod "480f17b9-c37d-4cc1-a611-9500bae66f11" (UID: "480f17b9-c37d-4cc1-a611-9500bae66f11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.807426 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.807461 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.807471 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.807479 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:33 crc kubenswrapper[4725]: I0227 06:32:33.807487 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/480f17b9-c37d-4cc1-a611-9500bae66f11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.150058 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.176836 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.283959 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.283951 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bbccbc7cf-sts7s" event={"ID":"480f17b9-c37d-4cc1-a611-9500bae66f11","Type":"ContainerDied","Data":"444374ca1e3de85d55d7e6fff65f20283878b893c1b056a0b39ba9c8956684eb"} Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.284298 4725 scope.go:117] "RemoveContainer" containerID="a2015d783012eb40f3a5917c9c739000fc14f1a0864079acfb13cd89bf9b3a4a" Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.284660 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="cinder-scheduler" containerID="cri-o://7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb" gracePeriod=30 Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.284823 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="probe" containerID="cri-o://b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1" gracePeriod=30 Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.309710 4725 scope.go:117] "RemoveContainer" containerID="265436abc44d2f8a2afc887c02aa8aa8de69f672d6e37798fc875d1ed616e468" Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.373589 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bbccbc7cf-sts7s"] Feb 27 06:32:34 crc kubenswrapper[4725]: I0227 06:32:34.385576 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bbccbc7cf-sts7s"] Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.298990 4725 generic.go:334] "Generic (PLEG): container finished" podID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" 
containerID="b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1" exitCode=0 Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.299329 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aae62a22-d2ec-4af6-9e82-b7a3aafb9188","Type":"ContainerDied","Data":"b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1"} Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.471504 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-774b97d54d-8xt86" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.473753 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-774b97d54d-8xt86" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.704622 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-745fdc9fb8-jhz6h"] Feb 27 06:32:35 crc kubenswrapper[4725]: E0227 06:32:35.705004 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-api" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705017 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-api" Feb 27 06:32:35 crc kubenswrapper[4725]: E0227 06:32:35.705029 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-httpd" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705036 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-httpd" Feb 27 06:32:35 crc kubenswrapper[4725]: E0227 06:32:35.705047 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerName="init" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705055 4725 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerName="init" Feb 27 06:32:35 crc kubenswrapper[4725]: E0227 06:32:35.705069 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerName="dnsmasq-dns" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705075 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerName="dnsmasq-dns" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705244 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-httpd" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705253 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" containerName="dnsmasq-dns" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.705273 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4598bdbc-f18b-4709-baa9-013d097a4dfc" containerName="neutron-api" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.706274 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.721265 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-745fdc9fb8-jhz6h"] Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842537 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f553c85-a79e-4317-9140-708bda9525e2-logs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842603 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-config-data\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842666 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-internal-tls-certs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842691 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-combined-ca-bundle\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842740 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6rrdt\" (UniqueName: \"kubernetes.io/projected/3f553c85-a79e-4317-9140-708bda9525e2-kube-api-access-6rrdt\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842801 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-public-tls-certs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.842824 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-scripts\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.944772 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-internal-tls-certs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.945139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-combined-ca-bundle\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.945445 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrdt\" 
(UniqueName: \"kubernetes.io/projected/3f553c85-a79e-4317-9140-708bda9525e2-kube-api-access-6rrdt\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.945676 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-public-tls-certs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.945903 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-scripts\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.946099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f553c85-a79e-4317-9140-708bda9525e2-logs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.946330 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-config-data\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.946859 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f553c85-a79e-4317-9140-708bda9525e2-logs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: 
\"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.951879 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-scripts\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.952921 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-config-data\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.953544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-combined-ca-bundle\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.956102 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-public-tls-certs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.968832 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f553c85-a79e-4317-9140-708bda9525e2-internal-tls-certs\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: 
I0227 06:32:35.987271 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrdt\" (UniqueName: \"kubernetes.io/projected/3f553c85-a79e-4317-9140-708bda9525e2-kube-api-access-6rrdt\") pod \"placement-745fdc9fb8-jhz6h\" (UID: \"3f553c85-a79e-4317-9140-708bda9525e2\") " pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:35 crc kubenswrapper[4725]: I0227 06:32:35.994634 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.022548 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.252072 4725 scope.go:117] "RemoveContainer" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.263813 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480f17b9-c37d-4cc1-a611-9500bae66f11" path="/var/lib/kubelet/pods/480f17b9-c37d-4cc1-a611-9500bae66f11/volumes" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.321624 4725 generic.go:334] "Generic (PLEG): container finished" podID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerID="7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb" exitCode=0 Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.322692 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aae62a22-d2ec-4af6-9e82-b7a3aafb9188","Type":"ContainerDied","Data":"7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb"} Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.486983 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.589781 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-scripts\") pod \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.589965 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data\") pod \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.590012 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-combined-ca-bundle\") pod \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.590166 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-etc-machine-id\") pod \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.590253 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aae62a22-d2ec-4af6-9e82-b7a3aafb9188" (UID: "aae62a22-d2ec-4af6-9e82-b7a3aafb9188"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.590377 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data-custom\") pod \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.590432 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xd8n\" (UniqueName: \"kubernetes.io/projected/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-kube-api-access-5xd8n\") pod \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\" (UID: \"aae62a22-d2ec-4af6-9e82-b7a3aafb9188\") " Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.591527 4725 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.595371 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aae62a22-d2ec-4af6-9e82-b7a3aafb9188" (UID: "aae62a22-d2ec-4af6-9e82-b7a3aafb9188"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.596385 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-scripts" (OuterVolumeSpecName: "scripts") pod "aae62a22-d2ec-4af6-9e82-b7a3aafb9188" (UID: "aae62a22-d2ec-4af6-9e82-b7a3aafb9188"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.599850 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-kube-api-access-5xd8n" (OuterVolumeSpecName: "kube-api-access-5xd8n") pod "aae62a22-d2ec-4af6-9e82-b7a3aafb9188" (UID: "aae62a22-d2ec-4af6-9e82-b7a3aafb9188"). InnerVolumeSpecName "kube-api-access-5xd8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.601440 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-745fdc9fb8-jhz6h"] Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.644386 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aae62a22-d2ec-4af6-9e82-b7a3aafb9188" (UID: "aae62a22-d2ec-4af6-9e82-b7a3aafb9188"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.693032 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.693063 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.693072 4725 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.693080 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xd8n\" (UniqueName: \"kubernetes.io/projected/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-kube-api-access-5xd8n\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.733531 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data" (OuterVolumeSpecName: "config-data") pod "aae62a22-d2ec-4af6-9e82-b7a3aafb9188" (UID: "aae62a22-d2ec-4af6-9e82-b7a3aafb9188"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:36 crc kubenswrapper[4725]: I0227 06:32:36.795549 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aae62a22-d2ec-4af6-9e82-b7a3aafb9188-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.331011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerStarted","Data":"fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da"} Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.333010 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aae62a22-d2ec-4af6-9e82-b7a3aafb9188","Type":"ContainerDied","Data":"9beb7290dd3b0aad9b0cf014519a3bf3173d0599bd81577bf8a3dc35f36b6cb5"} Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.333044 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.333054 4725 scope.go:117] "RemoveContainer" containerID="b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.335042 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-745fdc9fb8-jhz6h" event={"ID":"3f553c85-a79e-4317-9140-708bda9525e2","Type":"ContainerStarted","Data":"9527295db645a13518eb16423392ce7a5b34f85ccee0aee606eeec82308baeea"} Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.335065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-745fdc9fb8-jhz6h" event={"ID":"3f553c85-a79e-4317-9140-708bda9525e2","Type":"ContainerStarted","Data":"012f4cf58adde928486e1f51e291fbe7e6f713b81e409bccf8dcbc67bf61a547"} Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.335075 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-745fdc9fb8-jhz6h" event={"ID":"3f553c85-a79e-4317-9140-708bda9525e2","Type":"ContainerStarted","Data":"efdcd233d24968603c2cd0ed69eaf2dff6525d7bf362c2b3de0a3666c0f89a9d"} Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.335182 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.357681 4725 scope.go:117] "RemoveContainer" containerID="7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.396603 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-745fdc9fb8-jhz6h" podStartSLOduration=2.396578156 podStartE2EDuration="2.396578156s" podCreationTimestamp="2026-02-27 06:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:37.387132201 +0000 UTC 
m=+1335.849752790" watchObservedRunningTime="2026-02-27 06:32:37.396578156 +0000 UTC m=+1335.859198725" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.449670 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.457932 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.465675 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:37 crc kubenswrapper[4725]: E0227 06:32:37.466237 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="probe" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.466262 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="probe" Feb 27 06:32:37 crc kubenswrapper[4725]: E0227 06:32:37.466277 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="cinder-scheduler" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.466303 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="cinder-scheduler" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.466560 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="probe" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.466579 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" containerName="cinder-scheduler" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.467931 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.469682 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.473380 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.614165 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.614321 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-config-data\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.614409 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.614477 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b71d8cfd-c55f-43fd-b7b7-90c063488103-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 
06:32:37.614534 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58jd\" (UniqueName: \"kubernetes.io/projected/b71d8cfd-c55f-43fd-b7b7-90c063488103-kube-api-access-c58jd\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.614575 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-scripts\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716316 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-config-data\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716598 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716649 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/b71d8cfd-c55f-43fd-b7b7-90c063488103-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716685 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58jd\" (UniqueName: \"kubernetes.io/projected/b71d8cfd-c55f-43fd-b7b7-90c063488103-kube-api-access-c58jd\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716748 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-scripts\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.716925 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b71d8cfd-c55f-43fd-b7b7-90c063488103-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.721485 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.725362 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-scripts\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " 
pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.736617 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.737655 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b71d8cfd-c55f-43fd-b7b7-90c063488103-config-data\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.739763 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58jd\" (UniqueName: \"kubernetes.io/projected/b71d8cfd-c55f-43fd-b7b7-90c063488103-kube-api-access-c58jd\") pod \"cinder-scheduler-0\" (UID: \"b71d8cfd-c55f-43fd-b7b7-90c063488103\") " pod="openstack/cinder-scheduler-0" Feb 27 06:32:37 crc kubenswrapper[4725]: I0227 06:32:37.788199 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 06:32:38 crc kubenswrapper[4725]: I0227 06:32:38.266552 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae62a22-d2ec-4af6-9e82-b7a3aafb9188" path="/var/lib/kubelet/pods/aae62a22-d2ec-4af6-9e82-b7a3aafb9188/volumes" Feb 27 06:32:38 crc kubenswrapper[4725]: I0227 06:32:38.320689 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 06:32:38 crc kubenswrapper[4725]: I0227 06:32:38.346779 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b71d8cfd-c55f-43fd-b7b7-90c063488103","Type":"ContainerStarted","Data":"5d3050ab460844404198f77039887d840f9ec1bb26773cab6b09c21f8f7e6835"} Feb 27 06:32:38 crc kubenswrapper[4725]: I0227 06:32:38.347011 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.017054 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.123874 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7959dbc8c4-8fc74" Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.185135 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8547558994-qzqlz"] Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.185396 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8547558994-qzqlz" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api-log" containerID="cri-o://f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed" gracePeriod=30 Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.185462 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-8547558994-qzqlz" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api" containerID="cri-o://db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a" gracePeriod=30 Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.407689 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b71d8cfd-c55f-43fd-b7b7-90c063488103","Type":"ContainerStarted","Data":"d737ef9abcdafbade5d6d95b69bf6ad0a718e43438c9d88cd4ec141ebcae4221"} Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.413240 4725 generic.go:334] "Generic (PLEG): container finished" podID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerID="f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed" exitCode=143 Feb 27 06:32:39 crc kubenswrapper[4725]: I0227 06:32:39.413369 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8547558994-qzqlz" event={"ID":"38a3a342-d88d-4c78-a3d0-19755678d48b","Type":"ContainerDied","Data":"f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed"} Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.424188 4725 generic.go:334] "Generic (PLEG): container finished" podID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerID="fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da" exitCode=1 Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.424536 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerDied","Data":"fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da"} Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.424693 4725 scope.go:117] "RemoveContainer" containerID="768c5bbd8389c44843dea61b5b10b91931c50b2680e1c22d333ac27e72999e5e" Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.425727 4725 scope.go:117] "RemoveContainer" 
containerID="fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da" Feb 27 06:32:40 crc kubenswrapper[4725]: E0227 06:32:40.426447 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(98e9aa25-5670-466b-92a2-26b711b3ccf4)\"" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.427994 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b71d8cfd-c55f-43fd-b7b7-90c063488103","Type":"ContainerStarted","Data":"784a839fd888819770434a232b15f1f3546001a6346b63e5c661e13d7d83b7fa"} Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.840313 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8547558994-qzqlz" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.184:9311/healthcheck\": read tcp 10.217.0.2:41444->10.217.0.184:9311: read: connection reset by peer" Feb 27 06:32:40 crc kubenswrapper[4725]: I0227 06:32:40.840274 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8547558994-qzqlz" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.184:9311/healthcheck\": read tcp 10.217.0.2:41454->10.217.0.184:9311: read: connection reset by peer" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.247459 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.275559 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.275541308 podStartE2EDuration="4.275541308s" podCreationTimestamp="2026-02-27 06:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:40.485976082 +0000 UTC m=+1338.948596661" watchObservedRunningTime="2026-02-27 06:32:41.275541308 +0000 UTC m=+1339.738161878" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.415261 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data-custom\") pod \"38a3a342-d88d-4c78-a3d0-19755678d48b\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.415415 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38a3a342-d88d-4c78-a3d0-19755678d48b-logs\") pod \"38a3a342-d88d-4c78-a3d0-19755678d48b\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.415462 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-combined-ca-bundle\") pod \"38a3a342-d88d-4c78-a3d0-19755678d48b\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.415529 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data\") pod \"38a3a342-d88d-4c78-a3d0-19755678d48b\" (UID: 
\"38a3a342-d88d-4c78-a3d0-19755678d48b\") " Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.415579 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw25r\" (UniqueName: \"kubernetes.io/projected/38a3a342-d88d-4c78-a3d0-19755678d48b-kube-api-access-tw25r\") pod \"38a3a342-d88d-4c78-a3d0-19755678d48b\" (UID: \"38a3a342-d88d-4c78-a3d0-19755678d48b\") " Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.417360 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a3a342-d88d-4c78-a3d0-19755678d48b-logs" (OuterVolumeSpecName: "logs") pod "38a3a342-d88d-4c78-a3d0-19755678d48b" (UID: "38a3a342-d88d-4c78-a3d0-19755678d48b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.423610 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "38a3a342-d88d-4c78-a3d0-19755678d48b" (UID: "38a3a342-d88d-4c78-a3d0-19755678d48b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.425853 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a3a342-d88d-4c78-a3d0-19755678d48b-kube-api-access-tw25r" (OuterVolumeSpecName: "kube-api-access-tw25r") pod "38a3a342-d88d-4c78-a3d0-19755678d48b" (UID: "38a3a342-d88d-4c78-a3d0-19755678d48b"). InnerVolumeSpecName "kube-api-access-tw25r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.437845 4725 generic.go:334] "Generic (PLEG): container finished" podID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerID="db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a" exitCode=0 Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.437944 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8547558994-qzqlz" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.438549 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8547558994-qzqlz" event={"ID":"38a3a342-d88d-4c78-a3d0-19755678d48b","Type":"ContainerDied","Data":"db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a"} Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.438593 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8547558994-qzqlz" event={"ID":"38a3a342-d88d-4c78-a3d0-19755678d48b","Type":"ContainerDied","Data":"6d5e1b3f853a9dfe1ba1484108665dc97f315001c3b0956a28fdfd19e4153597"} Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.438612 4725 scope.go:117] "RemoveContainer" containerID="db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.460504 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38a3a342-d88d-4c78-a3d0-19755678d48b" (UID: "38a3a342-d88d-4c78-a3d0-19755678d48b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.506524 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data" (OuterVolumeSpecName: "config-data") pod "38a3a342-d88d-4c78-a3d0-19755678d48b" (UID: "38a3a342-d88d-4c78-a3d0-19755678d48b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.520570 4725 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.521756 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38a3a342-d88d-4c78-a3d0-19755678d48b-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.521793 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.521816 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a3a342-d88d-4c78-a3d0-19755678d48b-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.521835 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw25r\" (UniqueName: \"kubernetes.io/projected/38a3a342-d88d-4c78-a3d0-19755678d48b-kube-api-access-tw25r\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.566096 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6fccbd6487-2trpv" Feb 
27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.619840 4725 scope.go:117] "RemoveContainer" containerID="f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.672900 4725 scope.go:117] "RemoveContainer" containerID="db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a" Feb 27 06:32:41 crc kubenswrapper[4725]: E0227 06:32:41.673770 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a\": container with ID starting with db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a not found: ID does not exist" containerID="db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.673807 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a"} err="failed to get container status \"db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a\": rpc error: code = NotFound desc = could not find container \"db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a\": container with ID starting with db44755de196604fc02dd73322c8f1cfcc3339d6ea0c3c54a69b305731bb7b1a not found: ID does not exist" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.673833 4725 scope.go:117] "RemoveContainer" containerID="f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed" Feb 27 06:32:41 crc kubenswrapper[4725]: E0227 06:32:41.676961 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed\": container with ID starting with f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed not found: ID does not exist" 
containerID="f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.677013 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed"} err="failed to get container status \"f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed\": rpc error: code = NotFound desc = could not find container \"f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed\": container with ID starting with f5c17b1b2862e90e650ae423c79d6086615a87408641a88de155e5502eced3ed not found: ID does not exist" Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.773992 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8547558994-qzqlz"] Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.787775 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8547558994-qzqlz"] Feb 27 06:32:41 crc kubenswrapper[4725]: I0227 06:32:41.838161 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-56865cdb4-9hs85" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Feb 27 06:32:42 crc kubenswrapper[4725]: I0227 06:32:42.264881 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" path="/var/lib/kubelet/pods/38a3a342-d88d-4c78-a3d0-19755678d48b/volumes" Feb 27 06:32:42 crc kubenswrapper[4725]: I0227 06:32:42.492167 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:42 crc kubenswrapper[4725]: I0227 06:32:42.492256 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/watcher-decision-engine-0" Feb 27 06:32:42 crc kubenswrapper[4725]: I0227 06:32:42.493405 4725 scope.go:117] "RemoveContainer" containerID="fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da" Feb 27 06:32:42 crc kubenswrapper[4725]: E0227 06:32:42.493905 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(98e9aa25-5670-466b-92a2-26b711b3ccf4)\"" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" Feb 27 06:32:42 crc kubenswrapper[4725]: I0227 06:32:42.788606 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.442522 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:45 crc kubenswrapper[4725]: E0227 06:32:45.443265 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.443279 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api" Feb 27 06:32:45 crc kubenswrapper[4725]: E0227 06:32:45.443309 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api-log" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.443315 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api-log" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.443546 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api" Feb 27 06:32:45 crc 
kubenswrapper[4725]: I0227 06:32:45.443557 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a3a342-d88d-4c78-a3d0-19755678d48b" containerName="barbican-api-log" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.444369 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.449146 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.449193 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.451181 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p4r2g" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.458855 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.605508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.606258 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config-secret\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.606468 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.606806 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4z6d\" (UniqueName: \"kubernetes.io/projected/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-kube-api-access-x4z6d\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.708813 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.708880 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config-secret\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.708956 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.709006 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4z6d\" (UniqueName: \"kubernetes.io/projected/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-kube-api-access-x4z6d\") pod 
\"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.710908 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.720418 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config-secret\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.720904 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.728064 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4z6d\" (UniqueName: \"kubernetes.io/projected/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-kube-api-access-x4z6d\") pod \"openstackclient\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.784911 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.836336 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.848617 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.919848 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.922828 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:45 crc kubenswrapper[4725]: I0227 06:32:45.943423 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:46 crc kubenswrapper[4725]: E0227 06:32:46.001133 4725 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 06:32:46 crc kubenswrapper[4725]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_31e45fa6-7b5e-42ef-ac78-8d13906f7abc_0(ad521e73039a24c283ea0ad44723537f4632e9807dc49a4a7e0bd6a216913554): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ad521e73039a24c283ea0ad44723537f4632e9807dc49a4a7e0bd6a216913554" Netns:"/var/run/netns/e640f81b-eeaf-42aa-b6cf-fa25b54b3014" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ad521e73039a24c283ea0ad44723537f4632e9807dc49a4a7e0bd6a216913554;K8S_POD_UID=31e45fa6-7b5e-42ef-ac78-8d13906f7abc" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/31e45fa6-7b5e-42ef-ac78-8d13906f7abc]: expected pod UID "31e45fa6-7b5e-42ef-ac78-8d13906f7abc" but got 
"6c9af008-ad8e-4eaa-b631-543a0ef1bb00" from Kube API Feb 27 06:32:46 crc kubenswrapper[4725]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 06:32:46 crc kubenswrapper[4725]: > Feb 27 06:32:46 crc kubenswrapper[4725]: E0227 06:32:46.001202 4725 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 06:32:46 crc kubenswrapper[4725]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_31e45fa6-7b5e-42ef-ac78-8d13906f7abc_0(ad521e73039a24c283ea0ad44723537f4632e9807dc49a4a7e0bd6a216913554): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ad521e73039a24c283ea0ad44723537f4632e9807dc49a4a7e0bd6a216913554" Netns:"/var/run/netns/e640f81b-eeaf-42aa-b6cf-fa25b54b3014" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ad521e73039a24c283ea0ad44723537f4632e9807dc49a4a7e0bd6a216913554;K8S_POD_UID=31e45fa6-7b5e-42ef-ac78-8d13906f7abc" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/31e45fa6-7b5e-42ef-ac78-8d13906f7abc]: expected pod UID "31e45fa6-7b5e-42ef-ac78-8d13906f7abc" but got "6c9af008-ad8e-4eaa-b631-543a0ef1bb00" from Kube API Feb 27 06:32:46 crc kubenswrapper[4725]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 06:32:46 crc kubenswrapper[4725]: > pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.026508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6tdc\" (UniqueName: \"kubernetes.io/projected/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-kube-api-access-g6tdc\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.026693 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.026734 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-openstack-config\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.026764 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " 
pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.130753 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.131071 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-openstack-config\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.131117 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.131151 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6tdc\" (UniqueName: \"kubernetes.io/projected/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-kube-api-access-g6tdc\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.132241 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-openstack-config\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.135095 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.138078 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.152476 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6tdc\" (UniqueName: \"kubernetes.io/projected/6c9af008-ad8e-4eaa-b631-543a0ef1bb00-kube-api-access-g6tdc\") pod \"openstackclient\" (UID: \"6c9af008-ad8e-4eaa-b631-543a0ef1bb00\") " pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.229122 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.302230 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335349 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-tls-certs\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335466 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-combined-ca-bundle\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335581 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4fc5fc3-880a-46c5-a0a1-3248884d9882-logs\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335632 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqsmq\" (UniqueName: \"kubernetes.io/projected/a4fc5fc3-880a-46c5-a0a1-3248884d9882-kube-api-access-vqsmq\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335668 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-config-data\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335704 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-scripts\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.335742 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-secret-key\") pod \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\" (UID: \"a4fc5fc3-880a-46c5-a0a1-3248884d9882\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.336111 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fc5fc3-880a-46c5-a0a1-3248884d9882-logs" (OuterVolumeSpecName: "logs") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.344408 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.344546 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fc5fc3-880a-46c5-a0a1-3248884d9882-kube-api-access-vqsmq" (OuterVolumeSpecName: "kube-api-access-vqsmq") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "kube-api-access-vqsmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.361523 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-scripts" (OuterVolumeSpecName: "scripts") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.361545 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-config-data" (OuterVolumeSpecName: "config-data") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.370470 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.403434 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a4fc5fc3-880a-46c5-a0a1-3248884d9882" (UID: "a4fc5fc3-880a-46c5-a0a1-3248884d9882"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438408 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438506 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4fc5fc3-880a-46c5-a0a1-3248884d9882-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438520 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438533 4725 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438544 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fc5fc3-880a-46c5-a0a1-3248884d9882-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438553 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4fc5fc3-880a-46c5-a0a1-3248884d9882-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.438561 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqsmq\" (UniqueName: \"kubernetes.io/projected/a4fc5fc3-880a-46c5-a0a1-3248884d9882-kube-api-access-vqsmq\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.514940 4725 generic.go:334] 
"Generic (PLEG): container finished" podID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerID="0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370" exitCode=137 Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.515004 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.515609 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56865cdb4-9hs85" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.515730 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56865cdb4-9hs85" event={"ID":"a4fc5fc3-880a-46c5-a0a1-3248884d9882","Type":"ContainerDied","Data":"0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370"} Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.515785 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56865cdb4-9hs85" event={"ID":"a4fc5fc3-880a-46c5-a0a1-3248884d9882","Type":"ContainerDied","Data":"d7b6399bad48d193a2b88629e2ab802c17ead92bb73cce34440180ab7c5d8b73"} Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.515802 4725 scope.go:117] "RemoveContainer" containerID="07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.523858 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.527699 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="31e45fa6-7b5e-42ef-ac78-8d13906f7abc" podUID="6c9af008-ad8e-4eaa-b631-543a0ef1bb00" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.539612 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-combined-ca-bundle\") pod \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.539650 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4z6d\" (UniqueName: \"kubernetes.io/projected/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-kube-api-access-x4z6d\") pod \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.539702 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config\") pod \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.539719 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config-secret\") pod \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\" (UID: \"31e45fa6-7b5e-42ef-ac78-8d13906f7abc\") " Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.540264 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "31e45fa6-7b5e-42ef-ac78-8d13906f7abc" (UID: "31e45fa6-7b5e-42ef-ac78-8d13906f7abc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.547266 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56865cdb4-9hs85"] Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.550085 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-kube-api-access-x4z6d" (OuterVolumeSpecName: "kube-api-access-x4z6d") pod "31e45fa6-7b5e-42ef-ac78-8d13906f7abc" (UID: "31e45fa6-7b5e-42ef-ac78-8d13906f7abc"). InnerVolumeSpecName "kube-api-access-x4z6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.552895 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31e45fa6-7b5e-42ef-ac78-8d13906f7abc" (UID: "31e45fa6-7b5e-42ef-ac78-8d13906f7abc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.554540 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-56865cdb4-9hs85"] Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.558325 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "31e45fa6-7b5e-42ef-ac78-8d13906f7abc" (UID: "31e45fa6-7b5e-42ef-ac78-8d13906f7abc"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.641600 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.641634 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.641645 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.641654 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4z6d\" (UniqueName: \"kubernetes.io/projected/31e45fa6-7b5e-42ef-ac78-8d13906f7abc-kube-api-access-x4z6d\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.680978 4725 scope.go:117] "RemoveContainer" containerID="0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.698968 4725 scope.go:117] "RemoveContainer" containerID="07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998" Feb 27 06:32:46 crc kubenswrapper[4725]: E0227 06:32:46.699407 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998\": container with ID starting with 07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998 not found: ID does not exist" containerID="07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998" Feb 27 
06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.699444 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998"} err="failed to get container status \"07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998\": rpc error: code = NotFound desc = could not find container \"07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998\": container with ID starting with 07d00c503181949528afc8a49e697affeef2db076a7eb2d1a712633c517c4998 not found: ID does not exist" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.699469 4725 scope.go:117] "RemoveContainer" containerID="0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370" Feb 27 06:32:46 crc kubenswrapper[4725]: E0227 06:32:46.699776 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370\": container with ID starting with 0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370 not found: ID does not exist" containerID="0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.699801 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370"} err="failed to get container status \"0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370\": rpc error: code = NotFound desc = could not find container \"0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370\": container with ID starting with 0282bff9a70b60f8f223e4e6c0b3d2bff2c0bd72720586ed7a7aef8f05997370 not found: ID does not exist" Feb 27 06:32:46 crc kubenswrapper[4725]: I0227 06:32:46.784888 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 06:32:47 crc 
kubenswrapper[4725]: I0227 06:32:47.523913 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6c9af008-ad8e-4eaa-b631-543a0ef1bb00","Type":"ContainerStarted","Data":"27e803db9d7b026156b8fe0ba7ded1f451f43cac35b7aca1be2b817021f87bf8"} Feb 27 06:32:47 crc kubenswrapper[4725]: I0227 06:32:47.525563 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 06:32:47 crc kubenswrapper[4725]: I0227 06:32:47.552993 4725 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="31e45fa6-7b5e-42ef-ac78-8d13906f7abc" podUID="6c9af008-ad8e-4eaa-b631-543a0ef1bb00" Feb 27 06:32:47 crc kubenswrapper[4725]: I0227 06:32:47.982506 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.266277 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e45fa6-7b5e-42ef-ac78-8d13906f7abc" path="/var/lib/kubelet/pods/31e45fa6-7b5e-42ef-ac78-8d13906f7abc/volumes" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.267277 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" path="/var/lib/kubelet/pods/a4fc5fc3-880a-46c5-a0a1-3248884d9882/volumes" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.551272 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-559f68776c-7cj2d"] Feb 27 06:32:48 crc kubenswrapper[4725]: E0227 06:32:48.552082 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon-log" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.552118 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon-log" Feb 27 06:32:48 crc kubenswrapper[4725]: 
E0227 06:32:48.552182 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.552202 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.552615 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.552643 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fc5fc3-880a-46c5-a0a1-3248884d9882" containerName="horizon-log" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.554809 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.558525 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.558552 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.558868 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.571519 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-559f68776c-7cj2d"] Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.692660 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-config-data\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 
crc kubenswrapper[4725]: I0227 06:32:48.692742 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-public-tls-certs\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.692794 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39aed367-30f0-4ebd-a057-e33e50a6f748-log-httpd\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.692818 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-internal-tls-certs\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.692954 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-combined-ca-bundle\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.693037 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39aed367-30f0-4ebd-a057-e33e50a6f748-etc-swift\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " 
pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.693209 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39aed367-30f0-4ebd-a057-e33e50a6f748-run-httpd\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.693239 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5ck\" (UniqueName: \"kubernetes.io/projected/39aed367-30f0-4ebd-a057-e33e50a6f748-kube-api-access-rs5ck\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795010 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39aed367-30f0-4ebd-a057-e33e50a6f748-run-httpd\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795066 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5ck\" (UniqueName: \"kubernetes.io/projected/39aed367-30f0-4ebd-a057-e33e50a6f748-kube-api-access-rs5ck\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795211 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-config-data\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " 
pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795251 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-public-tls-certs\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795275 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39aed367-30f0-4ebd-a057-e33e50a6f748-log-httpd\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795369 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-internal-tls-certs\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.795859 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-combined-ca-bundle\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.796526 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39aed367-30f0-4ebd-a057-e33e50a6f748-run-httpd\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc 
kubenswrapper[4725]: I0227 06:32:48.796649 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39aed367-30f0-4ebd-a057-e33e50a6f748-etc-swift\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.796778 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39aed367-30f0-4ebd-a057-e33e50a6f748-log-httpd\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.802890 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-combined-ca-bundle\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.803598 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-internal-tls-certs\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.810053 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-public-tls-certs\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.811021 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39aed367-30f0-4ebd-a057-e33e50a6f748-etc-swift\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.811670 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39aed367-30f0-4ebd-a057-e33e50a6f748-config-data\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.815080 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5ck\" (UniqueName: \"kubernetes.io/projected/39aed367-30f0-4ebd-a057-e33e50a6f748-kube-api-access-rs5ck\") pod \"swift-proxy-559f68776c-7cj2d\" (UID: \"39aed367-30f0-4ebd-a057-e33e50a6f748\") " pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:48 crc kubenswrapper[4725]: I0227 06:32:48.891453 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:49 crc kubenswrapper[4725]: I0227 06:32:49.523022 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-559f68776c-7cj2d"] Feb 27 06:32:49 crc kubenswrapper[4725]: I0227 06:32:49.560767 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-559f68776c-7cj2d" event={"ID":"39aed367-30f0-4ebd-a057-e33e50a6f748","Type":"ContainerStarted","Data":"f50fdf524e447e0f362fcc8f6d038f12514581616693b5166c8e4ec312b4da65"} Feb 27 06:32:49 crc kubenswrapper[4725]: W0227 06:32:49.827886 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf2f6a7b_ad66_4ad0_983d_9fe66a70e20b.slice/crio-91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120.scope WatchSource:0}: Error finding container 91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120: Status 404 returned error can't find the container with id 91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120 Feb 27 06:32:49 crc kubenswrapper[4725]: W0227 06:32:49.841112 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-conmon-ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-conmon-ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756.scope: no such file or directory Feb 27 06:32:49 crc kubenswrapper[4725]: W0227 06:32:49.841174 4725 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756.scope: no such file or directory Feb 27 06:32:49 crc kubenswrapper[4725]: W0227 06:32:49.841196 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-conmon-7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-conmon-7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb.scope: no such file or directory Feb 27 06:32:49 crc kubenswrapper[4725]: W0227 06:32:49.841213 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-7fa984590a6f1576250926da3a35ef04ac3d222473497141c2bd024dd1c9e4bb.scope: no such file or directory Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.061181 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf2f6a7b_ad66_4ad0_983d_9fe66a70e20b.slice/crio-conmon-91d1f85a51c0039b452db8aed72f105c62a7f44aff64174505f5ee0a25f2a120.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4fc5fc3_880a_46c5_a0a1_3248884d9882.slice/crio-d7b6399bad48d193a2b88629e2ab802c17ead92bb73cce34440180ab7c5d8b73\": RecentStats: unable to find data in memory cache]" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.241749 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.327634 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-scripts\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.327685 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-run-httpd\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.327720 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-config-data\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.327813 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-sg-core-conf-yaml\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: 
\"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.327878 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-log-httpd\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.327943 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp94q\" (UniqueName: \"kubernetes.io/projected/f817188c-5563-4b93-abe7-94305a5c95a9-kube-api-access-bp94q\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.328013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-combined-ca-bundle\") pod \"f817188c-5563-4b93-abe7-94305a5c95a9\" (UID: \"f817188c-5563-4b93-abe7-94305a5c95a9\") " Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.329384 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.329653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.335209 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f817188c-5563-4b93-abe7-94305a5c95a9-kube-api-access-bp94q" (OuterVolumeSpecName: "kube-api-access-bp94q") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "kube-api-access-bp94q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.339397 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-scripts" (OuterVolumeSpecName: "scripts") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.364100 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.399202 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.415986 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-config-data" (OuterVolumeSpecName: "config-data") pod "f817188c-5563-4b93-abe7-94305a5c95a9" (UID: "f817188c-5563-4b93-abe7-94305a5c95a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.430581 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.430766 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.430826 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp94q\" (UniqueName: \"kubernetes.io/projected/f817188c-5563-4b93-abe7-94305a5c95a9-kube-api-access-bp94q\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.430881 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.430973 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f817188c-5563-4b93-abe7-94305a5c95a9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.431029 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.431089 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f817188c-5563-4b93-abe7-94305a5c95a9-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.570597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-559f68776c-7cj2d" event={"ID":"39aed367-30f0-4ebd-a057-e33e50a6f748","Type":"ContainerStarted","Data":"97d25421b5b697ae9e3ba1a147ad55d1aec2467039df8ed361e292c8ac329660"} Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.570651 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-559f68776c-7cj2d" event={"ID":"39aed367-30f0-4ebd-a057-e33e50a6f748","Type":"ContainerStarted","Data":"644004f78ea1e773619f364d9ac7739681ec08c30c3bbeb7a68662883b0e99b4"} Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.570867 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.570912 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.572934 4725 generic.go:334] "Generic (PLEG): container finished" podID="f817188c-5563-4b93-abe7-94305a5c95a9" containerID="d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3" exitCode=137 Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.572976 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerDied","Data":"d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3"} Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.573025 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f817188c-5563-4b93-abe7-94305a5c95a9","Type":"ContainerDied","Data":"5e912678bc8e972388309a7a24d0f78262f9d001cfcdb4ac35954685a13c3fd3"} Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.573047 4725 scope.go:117] "RemoveContainer" containerID="d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.572982 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.606817 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-559f68776c-7cj2d" podStartSLOduration=2.606796531 podStartE2EDuration="2.606796531s" podCreationTimestamp="2026-02-27 06:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:32:50.598319772 +0000 UTC m=+1349.060940361" watchObservedRunningTime="2026-02-27 06:32:50.606796531 +0000 UTC m=+1349.069417100" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.612313 4725 scope.go:117] "RemoveContainer" containerID="bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.645598 4725 scope.go:117] "RemoveContainer" containerID="8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.674764 4725 scope.go:117] "RemoveContainer" containerID="d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3" Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.675444 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3\": container with ID starting with 
d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3 not found: ID does not exist" containerID="d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.675489 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3"} err="failed to get container status \"d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3\": rpc error: code = NotFound desc = could not find container \"d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3\": container with ID starting with d2ce93e70849bef1f144375456429d35404b111752d17cf1e2cf49077966f3e3 not found: ID does not exist" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.675517 4725 scope.go:117] "RemoveContainer" containerID="bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1" Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.675909 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1\": container with ID starting with bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1 not found: ID does not exist" containerID="bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.675950 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1"} err="failed to get container status \"bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1\": rpc error: code = NotFound desc = could not find container \"bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1\": container with ID starting with bdde3437431d78932bc6da0e4e0e55ed6e265551f2d8b26ed6f8916706a9eff1 not found: ID does not 
exist" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.675976 4725 scope.go:117] "RemoveContainer" containerID="8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.675921 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.676808 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff\": container with ID starting with 8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff not found: ID does not exist" containerID="8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.677686 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff"} err="failed to get container status \"8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff\": rpc error: code = NotFound desc = could not find container \"8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff\": container with ID starting with 8fdf2479c3de4c06fabb6cbd0446ce3109bfff5e2bb09f423077c31540b524ff not found: ID does not exist" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.686219 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.699485 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.700007 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="proxy-httpd" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.700076 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="proxy-httpd" Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.700161 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="ceilometer-notification-agent" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.700219 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="ceilometer-notification-agent" Feb 27 06:32:50 crc kubenswrapper[4725]: E0227 06:32:50.700281 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="sg-core" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.700370 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="sg-core" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.700762 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="proxy-httpd" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.700844 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="ceilometer-notification-agent" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.700904 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" containerName="sg-core" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.702564 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.704982 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.705144 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.718101 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.839268 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-run-httpd\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.840509 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.840842 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-log-httpd\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.841033 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " 
pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.841320 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njkx\" (UniqueName: \"kubernetes.io/projected/9359342b-953e-4419-baff-b26bab3404c4-kube-api-access-7njkx\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.841540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-scripts\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.841743 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-config-data\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943134 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-log-httpd\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943199 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943235 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7njkx\" (UniqueName: \"kubernetes.io/projected/9359342b-953e-4419-baff-b26bab3404c4-kube-api-access-7njkx\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943330 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-scripts\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943364 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-config-data\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943450 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-run-httpd\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943765 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-log-httpd\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.943960 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc 
kubenswrapper[4725]: I0227 06:32:50.944246 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-run-httpd\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.947980 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.950054 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.950244 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-config-data\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.956679 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-scripts\") pod \"ceilometer-0\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:50 crc kubenswrapper[4725]: I0227 06:32:50.960575 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njkx\" (UniqueName: \"kubernetes.io/projected/9359342b-953e-4419-baff-b26bab3404c4-kube-api-access-7njkx\") pod \"ceilometer-0\" (UID: 
\"9359342b-953e-4419-baff-b26bab3404c4\") " pod="openstack/ceilometer-0" Feb 27 06:32:51 crc kubenswrapper[4725]: I0227 06:32:51.019186 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:32:51 crc kubenswrapper[4725]: I0227 06:32:51.122063 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:32:51 crc kubenswrapper[4725]: I0227 06:32:51.562090 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:32:51 crc kubenswrapper[4725]: I0227 06:32:51.614529 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerStarted","Data":"c92e66415e2a2886ece0a05d319ca89bdd74706066395807dd1c462fe0b74731"} Feb 27 06:32:52 crc kubenswrapper[4725]: I0227 06:32:52.274416 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f817188c-5563-4b93-abe7-94305a5c95a9" path="/var/lib/kubelet/pods/f817188c-5563-4b93-abe7-94305a5c95a9/volumes" Feb 27 06:32:52 crc kubenswrapper[4725]: I0227 06:32:52.493435 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:52 crc kubenswrapper[4725]: I0227 06:32:52.493483 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 27 06:32:52 crc kubenswrapper[4725]: I0227 06:32:52.494261 4725 scope.go:117] "RemoveContainer" containerID="fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da" Feb 27 06:32:52 crc kubenswrapper[4725]: E0227 06:32:52.494501 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(98e9aa25-5670-466b-92a2-26b711b3ccf4)\"" 
pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" Feb 27 06:32:54 crc kubenswrapper[4725]: E0227 06:32:54.151185 4725 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/cf6db42f26e4df2689dc35f3f96c377701199c3d2603bd4f88fc6620e4b8f0c2/diff" to get inode usage: stat /var/lib/containers/storage/overlay/cf6db42f26e4df2689dc35f3f96c377701199c3d2603bd4f88fc6620e4b8f0c2/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_horizon-7cc5ddffd5-r9bpn_ba49d2a8-4ee2-4fe4-87dd-d875279c77cd/horizon-log/0.log" to get inode usage: stat /var/log/pods/openstack_horizon-7cc5ddffd5-r9bpn_ba49d2a8-4ee2-4fe4-87dd-d875279c77cd/horizon-log/0.log: no such file or directory Feb 27 06:32:54 crc kubenswrapper[4725]: E0227 06:32:54.417011 4725 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/5ce993d8810191977912b36db5c95e140421946defca5f8b02230ad3f30a2fb8/diff" to get inode usage: stat /var/lib/containers/storage/overlay/5ce993d8810191977912b36db5c95e140421946defca5f8b02230ad3f30a2fb8/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_horizon-56865cdb4-9hs85_a4fc5fc3-880a-46c5-a0a1-3248884d9882/horizon-log/0.log" to get inode usage: stat /var/log/pods/openstack_horizon-56865cdb4-9hs85_a4fc5fc3-880a-46c5-a0a1-3248884d9882/horizon-log/0.log: no such file or directory Feb 27 06:32:55 crc kubenswrapper[4725]: E0227 06:32:55.201440 4725 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7d7fa8e3a846684c448465b18be144e216e8bf0bb04b6f4b66aef5b9f5b28bac/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7d7fa8e3a846684c448465b18be144e216e8bf0bb04b6f4b66aef5b9f5b28bac/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_horizon-7cc5ddffd5-r9bpn_ba49d2a8-4ee2-4fe4-87dd-d875279c77cd/horizon/0.log" to get inode usage: stat /var/log/pods/openstack_horizon-7cc5ddffd5-r9bpn_ba49d2a8-4ee2-4fe4-87dd-d875279c77cd/horizon/0.log: no such file or directory Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.311780 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.312050 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-log" containerID="cri-o://2a038dd6688e01b1305a86954662a27fed8e771b340b874444f3b95a6a5be964" gracePeriod=30 Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.312185 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-httpd" containerID="cri-o://ddeb12551e0545cef274b108a5df028e84004c848959f496b5e492599e202081" gracePeriod=30 Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.391065 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mz255"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.392644 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.401758 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mz255"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.489706 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbns6\" (UniqueName: \"kubernetes.io/projected/c28e6b4f-6ef4-4e8f-9b40-366064eec781-kube-api-access-dbns6\") pod \"nova-api-db-create-mz255\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.489805 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28e6b4f-6ef4-4e8f-9b40-366064eec781-operator-scripts\") pod \"nova-api-db-create-mz255\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.500084 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-m6gzd"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.501667 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.519411 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-m6gzd"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.591889 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbns6\" (UniqueName: \"kubernetes.io/projected/c28e6b4f-6ef4-4e8f-9b40-366064eec781-kube-api-access-dbns6\") pod \"nova-api-db-create-mz255\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.591996 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q9gh\" (UniqueName: \"kubernetes.io/projected/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-kube-api-access-8q9gh\") pod \"nova-cell0-db-create-m6gzd\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.592023 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-operator-scripts\") pod \"nova-cell0-db-create-m6gzd\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.592048 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28e6b4f-6ef4-4e8f-9b40-366064eec781-operator-scripts\") pod \"nova-api-db-create-mz255\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.592861 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c28e6b4f-6ef4-4e8f-9b40-366064eec781-operator-scripts\") pod \"nova-api-db-create-mz255\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.597834 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-46fc-account-create-update-sp56g"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.599078 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.603714 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.608046 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-46fc-account-create-update-sp56g"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.623043 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbns6\" (UniqueName: \"kubernetes.io/projected/c28e6b4f-6ef4-4e8f-9b40-366064eec781-kube-api-access-dbns6\") pod \"nova-api-db-create-mz255\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.664729 4725 generic.go:334] "Generic (PLEG): container finished" podID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerID="2a038dd6688e01b1305a86954662a27fed8e771b340b874444f3b95a6a5be964" exitCode=143 Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.664776 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66dd6dba-4ab7-4fa0-88a5-abdccb202492","Type":"ContainerDied","Data":"2a038dd6688e01b1305a86954662a27fed8e771b340b874444f3b95a6a5be964"} Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.686983 4725 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-db-create-tzhf5"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.688201 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.693746 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q9gh\" (UniqueName: \"kubernetes.io/projected/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-kube-api-access-8q9gh\") pod \"nova-cell0-db-create-m6gzd\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.693783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-operator-scripts\") pod \"nova-cell0-db-create-m6gzd\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.693863 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/903b7538-3fc0-4580-9bd0-adff6ce3f634-operator-scripts\") pod \"nova-api-46fc-account-create-update-sp56g\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.693926 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5z2\" (UniqueName: \"kubernetes.io/projected/903b7538-3fc0-4580-9bd0-adff6ce3f634-kube-api-access-6r5z2\") pod \"nova-api-46fc-account-create-update-sp56g\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.694539 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-operator-scripts\") pod \"nova-cell0-db-create-m6gzd\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.710742 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tzhf5"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.732376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q9gh\" (UniqueName: \"kubernetes.io/projected/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-kube-api-access-8q9gh\") pod \"nova-cell0-db-create-m6gzd\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.763261 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mz255" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.808482 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2e80d4-9806-45a7-b10e-91d387331e54-operator-scripts\") pod \"nova-cell1-db-create-tzhf5\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.808603 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5z2\" (UniqueName: \"kubernetes.io/projected/903b7538-3fc0-4580-9bd0-adff6ce3f634-kube-api-access-6r5z2\") pod \"nova-api-46fc-account-create-update-sp56g\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.808722 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6k5g\" (UniqueName: \"kubernetes.io/projected/6c2e80d4-9806-45a7-b10e-91d387331e54-kube-api-access-d6k5g\") pod \"nova-cell1-db-create-tzhf5\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.808761 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/903b7538-3fc0-4580-9bd0-adff6ce3f634-operator-scripts\") pod \"nova-api-46fc-account-create-update-sp56g\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.809537 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/903b7538-3fc0-4580-9bd0-adff6ce3f634-operator-scripts\") pod \"nova-api-46fc-account-create-update-sp56g\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.820325 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fd8c-account-create-update-tn48s"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.822417 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.831073 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.831974 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.842634 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5z2\" (UniqueName: \"kubernetes.io/projected/903b7538-3fc0-4580-9bd0-adff6ce3f634-kube-api-access-6r5z2\") pod \"nova-api-46fc-account-create-update-sp56g\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.851773 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fd8c-account-create-update-tn48s"] Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.914453 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af23271e-c19f-475c-8ff7-51e9bbe4471e-operator-scripts\") pod \"nova-cell0-fd8c-account-create-update-tn48s\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.914516 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6k5g\" (UniqueName: \"kubernetes.io/projected/6c2e80d4-9806-45a7-b10e-91d387331e54-kube-api-access-d6k5g\") pod \"nova-cell1-db-create-tzhf5\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.914588 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2e80d4-9806-45a7-b10e-91d387331e54-operator-scripts\") pod \"nova-cell1-db-create-tzhf5\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.914662 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4fk9\" (UniqueName: \"kubernetes.io/projected/af23271e-c19f-475c-8ff7-51e9bbe4471e-kube-api-access-d4fk9\") pod \"nova-cell0-fd8c-account-create-update-tn48s\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.915552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2e80d4-9806-45a7-b10e-91d387331e54-operator-scripts\") pod \"nova-cell1-db-create-tzhf5\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.929603 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:32:56 crc kubenswrapper[4725]: I0227 06:32:56.952870 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6k5g\" (UniqueName: \"kubernetes.io/projected/6c2e80d4-9806-45a7-b10e-91d387331e54-kube-api-access-d6k5g\") pod \"nova-cell1-db-create-tzhf5\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.001806 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3cc6-account-create-update-8fxff"] Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.003817 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.007724 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.010087 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.011353 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3cc6-account-create-update-8fxff"] Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.017922 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af23271e-c19f-475c-8ff7-51e9bbe4471e-operator-scripts\") pod \"nova-cell0-fd8c-account-create-update-tn48s\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.018048 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4fk9\" (UniqueName: \"kubernetes.io/projected/af23271e-c19f-475c-8ff7-51e9bbe4471e-kube-api-access-d4fk9\") pod \"nova-cell0-fd8c-account-create-update-tn48s\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.019076 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af23271e-c19f-475c-8ff7-51e9bbe4471e-operator-scripts\") pod \"nova-cell0-fd8c-account-create-update-tn48s\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.032210 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b86d8c849-9kc54" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.050253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d4fk9\" (UniqueName: \"kubernetes.io/projected/af23271e-c19f-475c-8ff7-51e9bbe4471e-kube-api-access-d4fk9\") pod \"nova-cell0-fd8c-account-create-update-tn48s\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.113146 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78d8db4dd4-8t58x"] Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.113425 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78d8db4dd4-8t58x" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-api" containerID="cri-o://e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90" gracePeriod=30 Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.113791 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78d8db4dd4-8t58x" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-httpd" containerID="cri-o://a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17" gracePeriod=30 Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.124879 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwqgq\" (UniqueName: \"kubernetes.io/projected/87486239-3017-44ed-ba9d-a28541bb2aca-kube-api-access-nwqgq\") pod \"nova-cell1-3cc6-account-create-update-8fxff\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.124999 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87486239-3017-44ed-ba9d-a28541bb2aca-operator-scripts\") pod \"nova-cell1-3cc6-account-create-update-8fxff\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " 
pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.223693 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.237512 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwqgq\" (UniqueName: \"kubernetes.io/projected/87486239-3017-44ed-ba9d-a28541bb2aca-kube-api-access-nwqgq\") pod \"nova-cell1-3cc6-account-create-update-8fxff\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.237577 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87486239-3017-44ed-ba9d-a28541bb2aca-operator-scripts\") pod \"nova-cell1-3cc6-account-create-update-8fxff\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.238350 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87486239-3017-44ed-ba9d-a28541bb2aca-operator-scripts\") pod \"nova-cell1-3cc6-account-create-update-8fxff\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.258974 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwqgq\" (UniqueName: \"kubernetes.io/projected/87486239-3017-44ed-ba9d-a28541bb2aca-kube-api-access-nwqgq\") pod \"nova-cell1-3cc6-account-create-update-8fxff\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: 
W0227 06:32:57.277183 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-conmon-a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-conmon-a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19.scope: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: W0227 06:32:57.277235 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6906bd_905c_49d9_92d4_3f59b948ad2a.slice/crio-a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19.scope: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: W0227 06:32:57.277254 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-conmon-b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-conmon-b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1.scope: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: W0227 06:32:57.277269 4725 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae62a22_d2ec_4af6_9e82_b7a3aafb9188.slice/crio-b77a92cd1e6fdd79ec32d901878355e3bfda2aebd4940ce50ab7ad6de38f27d1.scope: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: W0227 06:32:57.297914 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e9aa25_5670_466b_92a2_26b711b3ccf4.slice/crio-conmon-fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e9aa25_5670_466b_92a2_26b711b3ccf4.slice/crio-conmon-fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da.scope: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: W0227 06:32:57.297976 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e9aa25_5670_466b_92a2_26b711b3ccf4.slice/crio-fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e9aa25_5670_466b_92a2_26b711b3ccf4.slice/crio-fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da.scope: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: W0227 06:32:57.307221 4725 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e45fa6_7b5e_42ef_ac78_8d13906f7abc.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e45fa6_7b5e_42ef_ac78_8d13906f7abc.slice: no such file or directory Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.338363 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.649938 4725 scope.go:117] "RemoveContainer" containerID="6e97d5c93ea25766734af3f35175af30c69bcdd8660a6bac808b212878cd4555" Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.675823 4725 generic.go:334] "Generic (PLEG): container finished" podID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerID="a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19" exitCode=137 Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.675881 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6906bd-905c-49d9-92d4-3f59b948ad2a","Type":"ContainerDied","Data":"a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19"} Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.682113 4725 generic.go:334] "Generic (PLEG): container finished" podID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerID="a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17" exitCode=0 Feb 27 06:32:57 crc kubenswrapper[4725]: I0227 06:32:57.682159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d8db4dd4-8t58x" event={"ID":"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096","Type":"ContainerDied","Data":"a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17"} Feb 27 06:32:58 crc kubenswrapper[4725]: I0227 06:32:58.001056 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.187:8776/healthcheck\": dial tcp 10.217.0.187:8776: connect: connection refused" 
Feb 27 06:32:58 crc kubenswrapper[4725]: I0227 06:32:58.702381 4725 generic.go:334] "Generic (PLEG): container finished" podID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerID="ddeb12551e0545cef274b108a5df028e84004c848959f496b5e492599e202081" exitCode=0 Feb 27 06:32:58 crc kubenswrapper[4725]: I0227 06:32:58.702445 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66dd6dba-4ab7-4fa0-88a5-abdccb202492","Type":"ContainerDied","Data":"ddeb12551e0545cef274b108a5df028e84004c848959f496b5e492599e202081"} Feb 27 06:32:58 crc kubenswrapper[4725]: I0227 06:32:58.905594 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:58 crc kubenswrapper[4725]: I0227 06:32:58.909715 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-559f68776c-7cj2d" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.212155 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279143 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279410 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-scripts\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279508 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6906bd-905c-49d9-92d4-3f59b948ad2a-logs\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279566 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data-custom\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279646 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sccbd\" (UniqueName: \"kubernetes.io/projected/3e6906bd-905c-49d9-92d4-3f59b948ad2a-kube-api-access-sccbd\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279737 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3e6906bd-905c-49d9-92d4-3f59b948ad2a-etc-machine-id\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.279759 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-combined-ca-bundle\") pod \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\" (UID: \"3e6906bd-905c-49d9-92d4-3f59b948ad2a\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.284451 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6906bd-905c-49d9-92d4-3f59b948ad2a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: "3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.285111 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e6906bd-905c-49d9-92d4-3f59b948ad2a-logs" (OuterVolumeSpecName: "logs") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: "3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.287127 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: "3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.287496 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6906bd-905c-49d9-92d4-3f59b948ad2a-kube-api-access-sccbd" (OuterVolumeSpecName: "kube-api-access-sccbd") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: "3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "kube-api-access-sccbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.295152 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-scripts" (OuterVolumeSpecName: "scripts") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: "3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.334195 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: "3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.358623 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.383162 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sccbd\" (UniqueName: \"kubernetes.io/projected/3e6906bd-905c-49d9-92d4-3f59b948ad2a-kube-api-access-sccbd\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.383187 4725 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6906bd-905c-49d9-92d4-3f59b948ad2a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.383196 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.383205 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.383216 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6906bd-905c-49d9-92d4-3f59b948ad2a-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.383223 4725 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.437895 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data" (OuterVolumeSpecName: "config-data") pod "3e6906bd-905c-49d9-92d4-3f59b948ad2a" (UID: 
"3e6906bd-905c-49d9-92d4-3f59b948ad2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.484888 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-config-data\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485273 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-httpd-run\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485324 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485396 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-combined-ca-bundle\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485440 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl79w\" (UniqueName: \"kubernetes.io/projected/66dd6dba-4ab7-4fa0-88a5-abdccb202492-kube-api-access-hl79w\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485502 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-internal-tls-certs\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485629 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-logs\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.485671 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-scripts\") pod \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\" (UID: \"66dd6dba-4ab7-4fa0-88a5-abdccb202492\") " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.486181 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6906bd-905c-49d9-92d4-3f59b948ad2a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.490146 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.491661 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-logs" (OuterVolumeSpecName: "logs") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.496455 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.519524 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-scripts" (OuterVolumeSpecName: "scripts") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.519554 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66dd6dba-4ab7-4fa0-88a5-abdccb202492-kube-api-access-hl79w" (OuterVolumeSpecName: "kube-api-access-hl79w") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "kube-api-access-hl79w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.573877 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-config-data" (OuterVolumeSpecName: "config-data") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.576553 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.582438 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-m6gzd"] Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.588004 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.588051 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.588073 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.588085 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl79w\" (UniqueName: \"kubernetes.io/projected/66dd6dba-4ab7-4fa0-88a5-abdccb202492-kube-api-access-hl79w\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.588094 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66dd6dba-4ab7-4fa0-88a5-abdccb202492-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:32:59 crc 
kubenswrapper[4725]: I0227 06:32:59.588102 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.588111 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.635182 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3cc6-account-create-update-8fxff"]
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.694404 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "66dd6dba-4ab7-4fa0-88a5-abdccb202492" (UID: "66dd6dba-4ab7-4fa0-88a5-abdccb202492"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.719853 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.733654 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m6gzd" event={"ID":"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a","Type":"ContainerStarted","Data":"f2f58aac03d91a0e831e22a56ecdf129c68ceb9ee2adf0a5ebbd5566f2ac9c32"}
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.740949 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerStarted","Data":"f30dfd51c600b5fb8dd30779305b2953492b61a4f3946d81174aaec1ac137200"}
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.746658 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6c9af008-ad8e-4eaa-b631-543a0ef1bb00","Type":"ContainerStarted","Data":"12b69452ce5d97dabf063de997ff30b38b1a7b3f84a24e958f72c3dbe53af7b4"}
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.750481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" event={"ID":"87486239-3017-44ed-ba9d-a28541bb2aca","Type":"ContainerStarted","Data":"20d60c1680e5f3c9fdcdfb6ce365782b66f571729b81d3e3a50a23fe5affb806"}
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.759438 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.760460 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6906bd-905c-49d9-92d4-3f59b948ad2a","Type":"ContainerDied","Data":"1d30bde061641191ecaf9f217ebb6e0259dd12e5c562ef082fd117bbc86e7756"}
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.760509 4725 scope.go:117] "RemoveContainer" containerID="a7d3f81a3cace48c341f0b60189596b0094fa712694f6180d56bb550dee28c19"
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.781446 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-46fc-account-create-update-sp56g"]
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.783045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"66dd6dba-4ab7-4fa0-88a5-abdccb202492","Type":"ContainerDied","Data":"84eb2e2e64d82dbee7a2bb41cce66f7440ba036c6a6d6291693f791609b406ef"}
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.783715 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.797700 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66dd6dba-4ab7-4fa0-88a5-abdccb202492-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.802270 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.815132 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mz255"]
Feb 27 06:32:59 crc kubenswrapper[4725]: W0227 06:32:59.826830 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c2e80d4_9806_45a7_b10e_91d387331e54.slice/crio-f2044ad31602102743662fe3c8f6f0db1c7e67874ee25d4ceae6433ae3aacd23 WatchSource:0}: Error finding container f2044ad31602102743662fe3c8f6f0db1c7e67874ee25d4ceae6433ae3aacd23: Status 404 returned error can't find the container with id f2044ad31602102743662fe3c8f6f0db1c7e67874ee25d4ceae6433ae3aacd23
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.827451 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tzhf5"]
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.828410 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.72383648 podStartE2EDuration="14.828394719s" podCreationTimestamp="2026-02-27 06:32:45 +0000 UTC" firstStartedPulling="2026-02-27 06:32:46.782877947 +0000 UTC m=+1345.245498516" lastFinishedPulling="2026-02-27 06:32:58.887436196 +0000 UTC m=+1357.350056755" observedRunningTime="2026-02-27 06:32:59.768925147 +0000 UTC m=+1358.231545716" watchObservedRunningTime="2026-02-27 06:32:59.828394719 +0000 UTC m=+1358.291015288"
Feb 27 06:32:59 crc kubenswrapper[4725]: I0227 06:32:59.874495 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fd8c-account-create-update-tn48s"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.006325 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.026931 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.045232 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.065544 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.079670 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: E0227 06:33:00.080532 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-httpd"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.080592 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-httpd"
Feb 27 06:33:00 crc kubenswrapper[4725]: E0227 06:33:00.080610 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api-log"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.080635 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api-log"
Feb 27 06:33:00 crc kubenswrapper[4725]: E0227 06:33:00.080656 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-log"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.080662 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-log"
Feb 27 06:33:00 crc kubenswrapper[4725]: E0227 06:33:00.080675 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.080681 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.081169 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api-log"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.081187 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-log"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.081197 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" containerName="glance-httpd"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.081212 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" containerName="cinder-api"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.083093 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.088429 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.090495 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.091973 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.092081 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.102114 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.103809 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.106045 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.106228 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.107470 4725 scope.go:117] "RemoveContainer" containerID="ab841c713c218d2341f0c0924c42ad2f25622485419c27a892006fdedbf80756"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.117312 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.163800 4725 scope.go:117] "RemoveContainer" containerID="ddeb12551e0545cef274b108a5df028e84004c848959f496b5e492599e202081"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212102 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-config-data\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212140 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212163 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212237 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-scripts\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212262 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819ee261-b129-4874-8f16-5f505d7b3c01-logs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212326 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-config-data-custom\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212350 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-public-tls-certs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212366 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212382 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdblc\" (UniqueName: \"kubernetes.io/projected/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-kube-api-access-xdblc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212431 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/819ee261-b129-4874-8f16-5f505d7b3c01-etc-machine-id\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212447 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212470 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212497 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212521 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212546 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.212572 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdx9h\" (UniqueName: \"kubernetes.io/projected/819ee261-b129-4874-8f16-5f505d7b3c01-kube-api-access-cdx9h\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.216069 4725 scope.go:117] "RemoveContainer" containerID="2a038dd6688e01b1305a86954662a27fed8e771b340b874444f3b95a6a5be964"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.298756 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6906bd-905c-49d9-92d4-3f59b948ad2a" path="/var/lib/kubelet/pods/3e6906bd-905c-49d9-92d4-3f59b948ad2a/volumes"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.299941 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66dd6dba-4ab7-4fa0-88a5-abdccb202492" path="/var/lib/kubelet/pods/66dd6dba-4ab7-4fa0-88a5-abdccb202492/volumes"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315278 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315405 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-scripts\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315438 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315456 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819ee261-b129-4874-8f16-5f505d7b3c01-logs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315486 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-config-data-custom\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315514 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-public-tls-certs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315531 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdblc\" (UniqueName: \"kubernetes.io/projected/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-kube-api-access-xdblc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315546 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315566 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/819ee261-b129-4874-8f16-5f505d7b3c01-etc-machine-id\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315582 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315635 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315658 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315684 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdx9h\" (UniqueName: \"kubernetes.io/projected/819ee261-b129-4874-8f16-5f505d7b3c01-kube-api-access-cdx9h\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315749 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-config-data\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.315763 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.316655 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.316990 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.317440 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/819ee261-b129-4874-8f16-5f505d7b3c01-logs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.325458 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.326841 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-public-tls-certs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.330490 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.334972 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.341944 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/819ee261-b129-4874-8f16-5f505d7b3c01-etc-machine-id\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.343618 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-config-data-custom\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.345770 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.346540 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.347492 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdblc\" (UniqueName: \"kubernetes.io/projected/41a3d13f-cb8c-42cc-aa8e-12d09fe458f1-kube-api-access-xdblc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.353394 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.354993 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.363666 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdx9h\" (UniqueName: \"kubernetes.io/projected/819ee261-b129-4874-8f16-5f505d7b3c01-kube-api-access-cdx9h\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.365536 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-config-data\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.369462 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/819ee261-b129-4874-8f16-5f505d7b3c01-scripts\") pod \"cinder-api-0\" (UID: \"819ee261-b129-4874-8f16-5f505d7b3c01\") " pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.404931 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d8db4dd4-8t58x"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.412641 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1\") " pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.426475 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.443400 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.525640 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jstn\" (UniqueName: \"kubernetes.io/projected/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-kube-api-access-2jstn\") pod \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") "
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.525785 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-ovndb-tls-certs\") pod \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") "
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.525852 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-httpd-config\") pod \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") "
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.525961 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-combined-ca-bundle\") pod \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") "
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.526026 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-config\") pod \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\" (UID: \"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096\") "
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.554485 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-kube-api-access-2jstn" (OuterVolumeSpecName: "kube-api-access-2jstn") pod "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" (UID: "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096"). InnerVolumeSpecName "kube-api-access-2jstn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.554524 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" (UID: "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.599588 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-config" (OuterVolumeSpecName: "config") pod "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" (UID: "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.604385 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" (UID: "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.628513 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.628551 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jstn\" (UniqueName: \"kubernetes.io/projected/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-kube-api-access-2jstn\") on node \"crc\" DevicePath \"\""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.628561 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.628569 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.638564 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" (UID: "52add5b7-bfbe-4d4c-ad4c-bf26a3afa096"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.730555 4725 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.816379 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerStarted","Data":"9bd39b1770b60d44a64dd9755922018a1f973ee19fb2f76330c25b903ea16a33"}
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.821096 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c2e80d4-9806-45a7-b10e-91d387331e54" containerID="a99d01d7467652bb4c93ac11f0301ed5b38259cc6f040509661a9a6e0d7efcaf" exitCode=0
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.821171 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tzhf5" event={"ID":"6c2e80d4-9806-45a7-b10e-91d387331e54","Type":"ContainerDied","Data":"a99d01d7467652bb4c93ac11f0301ed5b38259cc6f040509661a9a6e0d7efcaf"}
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.821196 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tzhf5" event={"ID":"6c2e80d4-9806-45a7-b10e-91d387331e54","Type":"ContainerStarted","Data":"f2044ad31602102743662fe3c8f6f0db1c7e67874ee25d4ceae6433ae3aacd23"}
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.824122 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" event={"ID":"af23271e-c19f-475c-8ff7-51e9bbe4471e","Type":"ContainerStarted","Data":"f3190fa7ccaffee1ae3d41ca52006b2c0f6abfb2097c73f9d59785e19ceb8492"}
Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.824166 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" event={"ID":"af23271e-c19f-475c-8ff7-51e9bbe4471e","Type":"ContainerStarted","Data":"b022aaa59edaad040c13db007b665118435b0e3526549570ba82616dafa395f1"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.836229 4725 generic.go:334] "Generic (PLEG): container finished" podID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerID="e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90" exitCode=0 Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.836364 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d8db4dd4-8t58x" Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.836473 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d8db4dd4-8t58x" event={"ID":"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096","Type":"ContainerDied","Data":"e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.836526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d8db4dd4-8t58x" event={"ID":"52add5b7-bfbe-4d4c-ad4c-bf26a3afa096","Type":"ContainerDied","Data":"48f69990b07a5f275825cb0551fd7f201396aca9f56a9558e4af3c98236f6f0a"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.836544 4725 scope.go:117] "RemoveContainer" containerID="a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17" Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.838398 4725 generic.go:334] "Generic (PLEG): container finished" podID="c28e6b4f-6ef4-4e8f-9b40-366064eec781" containerID="03649b7b355ab0d963f6f48aeec023aa86a68a2d053be838270217a1e9d76cf3" exitCode=0 Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.838572 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mz255" 
event={"ID":"c28e6b4f-6ef4-4e8f-9b40-366064eec781","Type":"ContainerDied","Data":"03649b7b355ab0d963f6f48aeec023aa86a68a2d053be838270217a1e9d76cf3"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.838675 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mz255" event={"ID":"c28e6b4f-6ef4-4e8f-9b40-366064eec781","Type":"ContainerStarted","Data":"d9a52273483b45d37ac966caf22cd7fa9d71bff0b0e42b32d8be39ef27e17df3"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.840858 4725 generic.go:334] "Generic (PLEG): container finished" podID="87486239-3017-44ed-ba9d-a28541bb2aca" containerID="5bf67bb66aa34941d965e0f91231a40281eeb4649cb3a35ac6075aa18783d165" exitCode=0 Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.841026 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" event={"ID":"87486239-3017-44ed-ba9d-a28541bb2aca","Type":"ContainerDied","Data":"5bf67bb66aa34941d965e0f91231a40281eeb4649cb3a35ac6075aa18783d165"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.879879 4725 generic.go:334] "Generic (PLEG): container finished" podID="dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" containerID="4290a0143f88e4f6b0dbfe4ef20b3bb7f79205484d3e75cf06918865c91a35cd" exitCode=0 Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.879985 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m6gzd" event={"ID":"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a","Type":"ContainerDied","Data":"4290a0143f88e4f6b0dbfe4ef20b3bb7f79205484d3e75cf06918865c91a35cd"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.900665 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-46fc-account-create-update-sp56g" event={"ID":"903b7538-3fc0-4580-9bd0-adff6ce3f634","Type":"ContainerStarted","Data":"50dac54b18dbafeee33efb59de4343536383f956d1774fe4d0956c2c621b9884"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.900717 
4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-46fc-account-create-update-sp56g" event={"ID":"903b7538-3fc0-4580-9bd0-adff6ce3f634","Type":"ContainerStarted","Data":"a126a4605a96334a95bd87526676a03cb18812b26da3a11fa7802bfb3e473500"} Feb 27 06:33:00 crc kubenswrapper[4725]: I0227 06:33:00.906823 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" podStartSLOduration=4.906801812 podStartE2EDuration="4.906801812s" podCreationTimestamp="2026-02-27 06:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:00.865970112 +0000 UTC m=+1359.328590681" watchObservedRunningTime="2026-02-27 06:33:00.906801812 +0000 UTC m=+1359.369422371" Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.012229 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.198714 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 06:33:01 crc kubenswrapper[4725]: W0227 06:33:01.272448 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41a3d13f_cb8c_42cc_aa8e_12d09fe458f1.slice/crio-ff896675a2f5bbb4046e44a47717a8a19d25c5a2969f5d72ce8b87e0c95f24e6 WatchSource:0}: Error finding container ff896675a2f5bbb4046e44a47717a8a19d25c5a2969f5d72ce8b87e0c95f24e6: Status 404 returned error can't find the container with id ff896675a2f5bbb4046e44a47717a8a19d25c5a2969f5d72ce8b87e0c95f24e6 Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.280482 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78d8db4dd4-8t58x"] Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.289877 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-78d8db4dd4-8t58x"] Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.307691 4725 scope.go:117] "RemoveContainer" containerID="e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90" Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.331839 4725 scope.go:117] "RemoveContainer" containerID="a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17" Feb 27 06:33:01 crc kubenswrapper[4725]: E0227 06:33:01.337671 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17\": container with ID starting with a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17 not found: ID does not exist" containerID="a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17" Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.337703 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17"} err="failed to get container status \"a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17\": rpc error: code = NotFound desc = could not find container \"a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17\": container with ID starting with a5e840923ef86b10f0c784347a693cd37cf0103de98d484ffed25c8247b11b17 not found: ID does not exist" Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.337725 4725 scope.go:117] "RemoveContainer" containerID="e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90" Feb 27 06:33:01 crc kubenswrapper[4725]: E0227 06:33:01.338652 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90\": container with ID starting with e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90 not 
found: ID does not exist" containerID="e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90" Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.338670 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90"} err="failed to get container status \"e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90\": rpc error: code = NotFound desc = could not find container \"e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90\": container with ID starting with e44141ab3ca8a6f4d55d8f667c9d24b0d5d40b18f71280a006f02c6734e03c90 not found: ID does not exist" Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.524582 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.525198 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-httpd" containerID="cri-o://6cabf474862b2dc817a7f6161f3736738f19b6db8b3cee4102de28ae88b1fcf0" gracePeriod=30 Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.524819 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-log" containerID="cri-o://4c97ac891aff3f694b1d884181146abb458296e7945a6ee0c74573d4a0679297" gracePeriod=30 Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.916858 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1","Type":"ContainerStarted","Data":"ff896675a2f5bbb4046e44a47717a8a19d25c5a2969f5d72ce8b87e0c95f24e6"} Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.937733 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"819ee261-b129-4874-8f16-5f505d7b3c01","Type":"ContainerStarted","Data":"a3ec698416ee1322f33d97fe45f5680831749e19d9e9a9dd91b5703d59bf6907"} Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.937791 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"819ee261-b129-4874-8f16-5f505d7b3c01","Type":"ContainerStarted","Data":"a2a33b50bf3f0f63341875ff2e7069842d6ae54248070802a2417a67cc5a56f8"} Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.942802 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerStarted","Data":"2ec09f664051c3124b757c1cf9e362d72b55611be7d9c1eaaba3050e36088ece"} Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.944728 4725 generic.go:334] "Generic (PLEG): container finished" podID="903b7538-3fc0-4580-9bd0-adff6ce3f634" containerID="50dac54b18dbafeee33efb59de4343536383f956d1774fe4d0956c2c621b9884" exitCode=0 Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.944765 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-46fc-account-create-update-sp56g" event={"ID":"903b7538-3fc0-4580-9bd0-adff6ce3f634","Type":"ContainerDied","Data":"50dac54b18dbafeee33efb59de4343536383f956d1774fe4d0956c2c621b9884"} Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.947258 4725 generic.go:334] "Generic (PLEG): container finished" podID="af23271e-c19f-475c-8ff7-51e9bbe4471e" containerID="f3190fa7ccaffee1ae3d41ca52006b2c0f6abfb2097c73f9d59785e19ceb8492" exitCode=0 Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.947316 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" event={"ID":"af23271e-c19f-475c-8ff7-51e9bbe4471e","Type":"ContainerDied","Data":"f3190fa7ccaffee1ae3d41ca52006b2c0f6abfb2097c73f9d59785e19ceb8492"} Feb 27 06:33:01 crc 
kubenswrapper[4725]: I0227 06:33:01.998151 4725 generic.go:334] "Generic (PLEG): container finished" podID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerID="4c97ac891aff3f694b1d884181146abb458296e7945a6ee0c74573d4a0679297" exitCode=143 Feb 27 06:33:01 crc kubenswrapper[4725]: I0227 06:33:01.998328 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a677ff7e-bb0e-4493-897b-c25dab79e22e","Type":"ContainerDied","Data":"4c97ac891aff3f694b1d884181146abb458296e7945a6ee0c74573d4a0679297"} Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.268913 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" path="/var/lib/kubelet/pods/52add5b7-bfbe-4d4c-ad4c-bf26a3afa096/volumes" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.473906 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.487309 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6k5g\" (UniqueName: \"kubernetes.io/projected/6c2e80d4-9806-45a7-b10e-91d387331e54-kube-api-access-d6k5g\") pod \"6c2e80d4-9806-45a7-b10e-91d387331e54\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.487483 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2e80d4-9806-45a7-b10e-91d387331e54-operator-scripts\") pod \"6c2e80d4-9806-45a7-b10e-91d387331e54\" (UID: \"6c2e80d4-9806-45a7-b10e-91d387331e54\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.488318 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c2e80d4-9806-45a7-b10e-91d387331e54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"6c2e80d4-9806-45a7-b10e-91d387331e54" (UID: "6c2e80d4-9806-45a7-b10e-91d387331e54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.492992 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2e80d4-9806-45a7-b10e-91d387331e54-kube-api-access-d6k5g" (OuterVolumeSpecName: "kube-api-access-d6k5g") pod "6c2e80d4-9806-45a7-b10e-91d387331e54" (UID: "6c2e80d4-9806-45a7-b10e-91d387331e54"). InnerVolumeSpecName "kube-api-access-d6k5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.553833 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.553873 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.553912 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.555176 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28426146ca35fe9273793a5717f9e41fb0368e16b25b6d5b4d504e333b929ead"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.555229 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://28426146ca35fe9273793a5717f9e41fb0368e16b25b6d5b4d504e333b929ead" gracePeriod=600 Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.590461 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6k5g\" (UniqueName: \"kubernetes.io/projected/6c2e80d4-9806-45a7-b10e-91d387331e54-kube-api-access-d6k5g\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.590496 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2e80d4-9806-45a7-b10e-91d387331e54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.876864 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.883878 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.894636 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mz255" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.894901 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.896058 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r5z2\" (UniqueName: \"kubernetes.io/projected/903b7538-3fc0-4580-9bd0-adff6ce3f634-kube-api-access-6r5z2\") pod \"903b7538-3fc0-4580-9bd0-adff6ce3f634\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.896120 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q9gh\" (UniqueName: \"kubernetes.io/projected/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-kube-api-access-8q9gh\") pod \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.896145 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-operator-scripts\") pod \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\" (UID: \"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.896295 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/903b7538-3fc0-4580-9bd0-adff6ce3f634-operator-scripts\") pod \"903b7538-3fc0-4580-9bd0-adff6ce3f634\" (UID: \"903b7538-3fc0-4580-9bd0-adff6ce3f634\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.899690 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" (UID: "dfe9b6e7-e6f5-4255-ab87-5c42cc89963a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.901457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903b7538-3fc0-4580-9bd0-adff6ce3f634-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "903b7538-3fc0-4580-9bd0-adff6ce3f634" (UID: "903b7538-3fc0-4580-9bd0-adff6ce3f634"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.908411 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903b7538-3fc0-4580-9bd0-adff6ce3f634-kube-api-access-6r5z2" (OuterVolumeSpecName: "kube-api-access-6r5z2") pod "903b7538-3fc0-4580-9bd0-adff6ce3f634" (UID: "903b7538-3fc0-4580-9bd0-adff6ce3f634"). InnerVolumeSpecName "kube-api-access-6r5z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.909476 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-kube-api-access-8q9gh" (OuterVolumeSpecName: "kube-api-access-8q9gh") pod "dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" (UID: "dfe9b6e7-e6f5-4255-ab87-5c42cc89963a"). InnerVolumeSpecName "kube-api-access-8q9gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.998128 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87486239-3017-44ed-ba9d-a28541bb2aca-operator-scripts\") pod \"87486239-3017-44ed-ba9d-a28541bb2aca\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.998884 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwqgq\" (UniqueName: \"kubernetes.io/projected/87486239-3017-44ed-ba9d-a28541bb2aca-kube-api-access-nwqgq\") pod \"87486239-3017-44ed-ba9d-a28541bb2aca\" (UID: \"87486239-3017-44ed-ba9d-a28541bb2aca\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.999216 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbns6\" (UniqueName: \"kubernetes.io/projected/c28e6b4f-6ef4-4e8f-9b40-366064eec781-kube-api-access-dbns6\") pod \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.999256 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28e6b4f-6ef4-4e8f-9b40-366064eec781-operator-scripts\") pod \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\" (UID: \"c28e6b4f-6ef4-4e8f-9b40-366064eec781\") " Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.999758 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/903b7538-3fc0-4580-9bd0-adff6ce3f634-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.999770 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r5z2\" (UniqueName: 
\"kubernetes.io/projected/903b7538-3fc0-4580-9bd0-adff6ce3f634-kube-api-access-6r5z2\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.999781 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q9gh\" (UniqueName: \"kubernetes.io/projected/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-kube-api-access-8q9gh\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:02 crc kubenswrapper[4725]: I0227 06:33:02.999791 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.000333 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87486239-3017-44ed-ba9d-a28541bb2aca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87486239-3017-44ed-ba9d-a28541bb2aca" (UID: "87486239-3017-44ed-ba9d-a28541bb2aca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.000363 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28e6b4f-6ef4-4e8f-9b40-366064eec781-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c28e6b4f-6ef4-4e8f-9b40-366064eec781" (UID: "c28e6b4f-6ef4-4e8f-9b40-366064eec781"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.005431 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28e6b4f-6ef4-4e8f-9b40-366064eec781-kube-api-access-dbns6" (OuterVolumeSpecName: "kube-api-access-dbns6") pod "c28e6b4f-6ef4-4e8f-9b40-366064eec781" (UID: "c28e6b4f-6ef4-4e8f-9b40-366064eec781"). InnerVolumeSpecName "kube-api-access-dbns6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.005901 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87486239-3017-44ed-ba9d-a28541bb2aca-kube-api-access-nwqgq" (OuterVolumeSpecName: "kube-api-access-nwqgq") pod "87486239-3017-44ed-ba9d-a28541bb2aca" (UID: "87486239-3017-44ed-ba9d-a28541bb2aca"). InnerVolumeSpecName "kube-api-access-nwqgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.020422 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" event={"ID":"87486239-3017-44ed-ba9d-a28541bb2aca","Type":"ContainerDied","Data":"20d60c1680e5f3c9fdcdfb6ce365782b66f571729b81d3e3a50a23fe5affb806"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.020462 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20d60c1680e5f3c9fdcdfb6ce365782b66f571729b81d3e3a50a23fe5affb806" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.020512 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3cc6-account-create-update-8fxff" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.032742 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mz255" event={"ID":"c28e6b4f-6ef4-4e8f-9b40-366064eec781","Type":"ContainerDied","Data":"d9a52273483b45d37ac966caf22cd7fa9d71bff0b0e42b32d8be39ef27e17df3"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.032777 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a52273483b45d37ac966caf22cd7fa9d71bff0b0e42b32d8be39ef27e17df3" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.032780 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mz255" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.039566 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.065832 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="28426146ca35fe9273793a5717f9e41fb0368e16b25b6d5b4d504e333b929ead" exitCode=0 Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.065911 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"28426146ca35fe9273793a5717f9e41fb0368e16b25b6d5b4d504e333b929ead"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.065946 4725 scope.go:117] "RemoveContainer" containerID="861186ed7f2e4b7df76123b88a35b60ba94275897d0a13a296d2198ea2a7a166" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.081958 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-46fc-account-create-update-sp56g" event={"ID":"903b7538-3fc0-4580-9bd0-adff6ce3f634","Type":"ContainerDied","Data":"a126a4605a96334a95bd87526676a03cb18812b26da3a11fa7802bfb3e473500"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.082123 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a126a4605a96334a95bd87526676a03cb18812b26da3a11fa7802bfb3e473500" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.082332 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-46fc-account-create-update-sp56g" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.096790 4725 generic.go:334] "Generic (PLEG): container finished" podID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerID="6cabf474862b2dc817a7f6161f3736738f19b6db8b3cee4102de28ae88b1fcf0" exitCode=0 Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.096847 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a677ff7e-bb0e-4493-897b-c25dab79e22e","Type":"ContainerDied","Data":"6cabf474862b2dc817a7f6161f3736738f19b6db8b3cee4102de28ae88b1fcf0"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.097062 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.109745 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbns6\" (UniqueName: \"kubernetes.io/projected/c28e6b4f-6ef4-4e8f-9b40-366064eec781-kube-api-access-dbns6\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.109774 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28e6b4f-6ef4-4e8f-9b40-366064eec781-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.109789 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87486239-3017-44ed-ba9d-a28541bb2aca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.109801 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwqgq\" (UniqueName: \"kubernetes.io/projected/87486239-3017-44ed-ba9d-a28541bb2aca-kube-api-access-nwqgq\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 
06:33:03.110219 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1","Type":"ContainerStarted","Data":"2f08f8fe344209cd0625e8e3a85ad4690eafadab71a30f20ad7445f8455b9d88"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.116652 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.108279398 podStartE2EDuration="4.108279398s" podCreationTimestamp="2026-02-27 06:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:03.082743169 +0000 UTC m=+1361.545363748" watchObservedRunningTime="2026-02-27 06:33:03.108279398 +0000 UTC m=+1361.570899967" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.125627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tzhf5" event={"ID":"6c2e80d4-9806-45a7-b10e-91d387331e54","Type":"ContainerDied","Data":"f2044ad31602102743662fe3c8f6f0db1c7e67874ee25d4ceae6433ae3aacd23"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.125680 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2044ad31602102743662fe3c8f6f0db1c7e67874ee25d4ceae6433ae3aacd23" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.125764 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tzhf5" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.153505 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m6gzd" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.153551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m6gzd" event={"ID":"dfe9b6e7-e6f5-4255-ab87-5c42cc89963a","Type":"ContainerDied","Data":"f2f58aac03d91a0e831e22a56ecdf129c68ceb9ee2adf0a5ebbd5566f2ac9c32"} Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.153585 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2f58aac03d91a0e831e22a56ecdf129c68ceb9ee2adf0a5ebbd5566f2ac9c32" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213193 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-combined-ca-bundle\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213580 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42r6d\" (UniqueName: \"kubernetes.io/projected/a677ff7e-bb0e-4493-897b-c25dab79e22e-kube-api-access-42r6d\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213642 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213663 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-public-tls-certs\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 
27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213740 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-httpd-run\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213757 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-config-data\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213787 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-scripts\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.213815 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-logs\") pod \"a677ff7e-bb0e-4493-897b-c25dab79e22e\" (UID: \"a677ff7e-bb0e-4493-897b-c25dab79e22e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.214441 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-logs" (OuterVolumeSpecName: "logs") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.214675 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.222653 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-scripts" (OuterVolumeSpecName: "scripts") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.228409 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.228656 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a677ff7e-bb0e-4493-897b-c25dab79e22e-kube-api-access-42r6d" (OuterVolumeSpecName: "kube-api-access-42r6d") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "kube-api-access-42r6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.257792 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.294437 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-config-data" (OuterVolumeSpecName: "config-data") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315895 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315925 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315935 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315946 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42r6d\" (UniqueName: \"kubernetes.io/projected/a677ff7e-bb0e-4493-897b-c25dab79e22e-kube-api-access-42r6d\") on node \"crc\" DevicePath \"\"" Feb 27 
06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315966 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315976 4725 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a677ff7e-bb0e-4493-897b-c25dab79e22e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.315984 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.322139 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a677ff7e-bb0e-4493-897b-c25dab79e22e" (UID: "a677ff7e-bb0e-4493-897b-c25dab79e22e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.335583 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.418586 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.418631 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a677ff7e-bb0e-4493-897b-c25dab79e22e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.526479 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.622938 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af23271e-c19f-475c-8ff7-51e9bbe4471e-operator-scripts\") pod \"af23271e-c19f-475c-8ff7-51e9bbe4471e\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.623272 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4fk9\" (UniqueName: \"kubernetes.io/projected/af23271e-c19f-475c-8ff7-51e9bbe4471e-kube-api-access-d4fk9\") pod \"af23271e-c19f-475c-8ff7-51e9bbe4471e\" (UID: \"af23271e-c19f-475c-8ff7-51e9bbe4471e\") " Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.623728 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af23271e-c19f-475c-8ff7-51e9bbe4471e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"af23271e-c19f-475c-8ff7-51e9bbe4471e" (UID: "af23271e-c19f-475c-8ff7-51e9bbe4471e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.624153 4725 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af23271e-c19f-475c-8ff7-51e9bbe4471e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.629405 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af23271e-c19f-475c-8ff7-51e9bbe4471e-kube-api-access-d4fk9" (OuterVolumeSpecName: "kube-api-access-d4fk9") pod "af23271e-c19f-475c-8ff7-51e9bbe4471e" (UID: "af23271e-c19f-475c-8ff7-51e9bbe4471e"). InnerVolumeSpecName "kube-api-access-d4fk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:03 crc kubenswrapper[4725]: I0227 06:33:03.726643 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4fk9\" (UniqueName: \"kubernetes.io/projected/af23271e-c19f-475c-8ff7-51e9bbe4471e-kube-api-access-d4fk9\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.165213 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"819ee261-b129-4874-8f16-5f505d7b3c01","Type":"ContainerStarted","Data":"cbe6c0bd031a077d17ae596bd9ad945ae95420fbe019ae86985893d41c198bcc"} Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.167107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerStarted","Data":"bae54cabb3c0f1aafa9bf5eacba839fc7dc40db4fd51e32db2c4ccd544ff83a0"} Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.167260 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-central-agent" containerID="cri-o://f30dfd51c600b5fb8dd30779305b2953492b61a4f3946d81174aaec1ac137200" gracePeriod=30 Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.167360 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.167696 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="proxy-httpd" containerID="cri-o://bae54cabb3c0f1aafa9bf5eacba839fc7dc40db4fd51e32db2c4ccd544ff83a0" gracePeriod=30 Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.167756 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="sg-core" containerID="cri-o://2ec09f664051c3124b757c1cf9e362d72b55611be7d9c1eaaba3050e36088ece" gracePeriod=30 Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.167796 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-notification-agent" containerID="cri-o://9bd39b1770b60d44a64dd9755922018a1f973ee19fb2f76330c25b903ea16a33" gracePeriod=30 Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.173469 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d"} Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.175144 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" 
event={"ID":"af23271e-c19f-475c-8ff7-51e9bbe4471e","Type":"ContainerDied","Data":"b022aaa59edaad040c13db007b665118435b0e3526549570ba82616dafa395f1"} Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.175168 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b022aaa59edaad040c13db007b665118435b0e3526549570ba82616dafa395f1" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.175220 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fd8c-account-create-update-tn48s" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.180200 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.180199 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a677ff7e-bb0e-4493-897b-c25dab79e22e","Type":"ContainerDied","Data":"83f34c871821d0534c30f1afbd92a2903532c8c897229713baecacb2f321b7da"} Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.180325 4725 scope.go:117] "RemoveContainer" containerID="6cabf474862b2dc817a7f6161f3736738f19b6db8b3cee4102de28ae88b1fcf0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.185444 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41a3d13f-cb8c-42cc-aa8e-12d09fe458f1","Type":"ContainerStarted","Data":"68774b9c67d7a6e23a9b93f7fc4fc9e0e66d5ea48621ae9c6da7e3a0f4ce49aa"} Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.200365 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.319934856 podStartE2EDuration="14.200347512s" podCreationTimestamp="2026-02-27 06:32:50 +0000 UTC" firstStartedPulling="2026-02-27 06:32:51.576341459 +0000 UTC m=+1350.038962028" lastFinishedPulling="2026-02-27 06:33:03.456754115 +0000 UTC 
m=+1361.919374684" observedRunningTime="2026-02-27 06:33:04.196085082 +0000 UTC m=+1362.658705661" watchObservedRunningTime="2026-02-27 06:33:04.200347512 +0000 UTC m=+1362.662968081" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.224421 4725 scope.go:117] "RemoveContainer" containerID="4c97ac891aff3f694b1d884181146abb458296e7945a6ee0c74573d4a0679297" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.230519 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.230500101 podStartE2EDuration="4.230500101s" podCreationTimestamp="2026-02-27 06:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:04.219279115 +0000 UTC m=+1362.681899684" watchObservedRunningTime="2026-02-27 06:33:04.230500101 +0000 UTC m=+1362.693120680" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.263458 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.267029 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278118 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278599 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af23271e-c19f-475c-8ff7-51e9bbe4471e" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278634 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="af23271e-c19f-475c-8ff7-51e9bbe4471e" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278648 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-log" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278655 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-log" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278674 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-httpd" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278701 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-httpd" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278716 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-api" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278722 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-api" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278734 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87486239-3017-44ed-ba9d-a28541bb2aca" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278739 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="87486239-3017-44ed-ba9d-a28541bb2aca" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278749 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28e6b4f-6ef4-4e8f-9b40-366064eec781" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278756 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28e6b4f-6ef4-4e8f-9b40-366064eec781" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278784 4725 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6c2e80d4-9806-45a7-b10e-91d387331e54" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278791 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2e80d4-9806-45a7-b10e-91d387331e54" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278801 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903b7538-3fc0-4580-9bd0-adff6ce3f634" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278807 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="903b7538-3fc0-4580-9bd0-adff6ce3f634" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278832 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-httpd" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278838 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-httpd" Feb 27 06:33:04 crc kubenswrapper[4725]: E0227 06:33:04.278872 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.278878 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279125 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="903b7538-3fc0-4580-9bd0-adff6ce3f634" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279143 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-httpd" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 
06:33:04.279154 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" containerName="glance-log" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279179 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2e80d4-9806-45a7-b10e-91d387331e54" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279189 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="af23271e-c19f-475c-8ff7-51e9bbe4471e" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279209 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-api" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279219 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="87486239-3017-44ed-ba9d-a28541bb2aca" containerName="mariadb-account-create-update" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279229 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279256 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="52add5b7-bfbe-4d4c-ad4c-bf26a3afa096" containerName="neutron-httpd" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.279267 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28e6b4f-6ef4-4e8f-9b40-366064eec781" containerName="mariadb-database-create" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.281591 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.290416 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.290607 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.291750 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.339935 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340119 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340210 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340236 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340353 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb52q\" (UniqueName: \"kubernetes.io/projected/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-kube-api-access-jb52q\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340448 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-logs\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340558 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.340644 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442270 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-logs\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442338 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442365 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442421 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442467 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442489 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442507 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.442538 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb52q\" (UniqueName: \"kubernetes.io/projected/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-kube-api-access-jb52q\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.443107 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-logs\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.443333 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.444315 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") device mount path \"/mnt/openstack/pv09\"" 
pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.448819 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.449673 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.450417 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.451691 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.463326 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb52q\" (UniqueName: \"kubernetes.io/projected/4b3b10b4-8a3a-492c-97ce-9ae74040d8ae-kube-api-access-jb52q\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: 
I0227 06:33:04.518167 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae\") " pod="openstack/glance-default-external-api-0" Feb 27 06:33:04 crc kubenswrapper[4725]: I0227 06:33:04.660898 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 06:33:05 crc kubenswrapper[4725]: I0227 06:33:05.182198 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 06:33:05 crc kubenswrapper[4725]: W0227 06:33:05.197927 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b3b10b4_8a3a_492c_97ce_9ae74040d8ae.slice/crio-c211b80cf432addf6fed7a6c259eeac8fa8ac18693e2fb4976fd055242a7caa0 WatchSource:0}: Error finding container c211b80cf432addf6fed7a6c259eeac8fa8ac18693e2fb4976fd055242a7caa0: Status 404 returned error can't find the container with id c211b80cf432addf6fed7a6c259eeac8fa8ac18693e2fb4976fd055242a7caa0 Feb 27 06:33:05 crc kubenswrapper[4725]: I0227 06:33:05.198328 4725 generic.go:334] "Generic (PLEG): container finished" podID="9359342b-953e-4419-baff-b26bab3404c4" containerID="bae54cabb3c0f1aafa9bf5eacba839fc7dc40db4fd51e32db2c4ccd544ff83a0" exitCode=0 Feb 27 06:33:05 crc kubenswrapper[4725]: I0227 06:33:05.198366 4725 generic.go:334] "Generic (PLEG): container finished" podID="9359342b-953e-4419-baff-b26bab3404c4" containerID="2ec09f664051c3124b757c1cf9e362d72b55611be7d9c1eaaba3050e36088ece" exitCode=2 Feb 27 06:33:05 crc kubenswrapper[4725]: I0227 06:33:05.198382 4725 generic.go:334] "Generic (PLEG): container finished" podID="9359342b-953e-4419-baff-b26bab3404c4" containerID="9bd39b1770b60d44a64dd9755922018a1f973ee19fb2f76330c25b903ea16a33" exitCode=0 Feb 27 06:33:05 crc 
kubenswrapper[4725]: I0227 06:33:05.198421 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerDied","Data":"bae54cabb3c0f1aafa9bf5eacba839fc7dc40db4fd51e32db2c4ccd544ff83a0"} Feb 27 06:33:05 crc kubenswrapper[4725]: I0227 06:33:05.198476 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerDied","Data":"2ec09f664051c3124b757c1cf9e362d72b55611be7d9c1eaaba3050e36088ece"} Feb 27 06:33:05 crc kubenswrapper[4725]: I0227 06:33:05.198497 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerDied","Data":"9bd39b1770b60d44a64dd9755922018a1f973ee19fb2f76330c25b903ea16a33"} Feb 27 06:33:06 crc kubenswrapper[4725]: I0227 06:33:06.214749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae","Type":"ContainerStarted","Data":"7c5153208e712fac6800c524db9888a9fd1feebec86ce35ba51cb8c2f2d53fd2"} Feb 27 06:33:06 crc kubenswrapper[4725]: I0227 06:33:06.215410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae","Type":"ContainerStarted","Data":"c211b80cf432addf6fed7a6c259eeac8fa8ac18693e2fb4976fd055242a7caa0"} Feb 27 06:33:06 crc kubenswrapper[4725]: I0227 06:33:06.252395 4725 scope.go:117] "RemoveContainer" containerID="fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da" Feb 27 06:33:06 crc kubenswrapper[4725]: I0227 06:33:06.279326 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a677ff7e-bb0e-4493-897b-c25dab79e22e" path="/var/lib/kubelet/pods/a677ff7e-bb0e-4493-897b-c25dab79e22e/volumes" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.059848 4725 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kjl9w"] Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.061667 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.063664 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.063906 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.064034 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zzbqh" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.071767 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kjl9w"] Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.103372 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.103505 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzvdl\" (UniqueName: \"kubernetes.io/projected/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-kube-api-access-xzvdl\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.103618 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-scripts\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.103749 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-config-data\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.206064 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-scripts\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.206176 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-config-data\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.206364 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.206416 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xzvdl\" (UniqueName: \"kubernetes.io/projected/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-kube-api-access-xzvdl\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.211464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.214679 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-scripts\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.215033 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-config-data\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.224857 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerStarted","Data":"b5a0eb19b930d1dba5f2c039111c6651bc5348f3b6040a2539df5e5bf88e1380"} Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.226773 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4b3b10b4-8a3a-492c-97ce-9ae74040d8ae","Type":"ContainerStarted","Data":"f3a771654dafe23a581cd0aa89ecbd5873761673e54b88b48803f80d33f56bf8"} Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.233257 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzvdl\" (UniqueName: \"kubernetes.io/projected/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-kube-api-access-xzvdl\") pod \"nova-cell0-conductor-db-sync-kjl9w\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.330413 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.368928 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.36890929 podStartE2EDuration="3.36890929s" podCreationTimestamp="2026-02-27 06:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:07.281553651 +0000 UTC m=+1365.744174220" watchObservedRunningTime="2026-02-27 06:33:07.36890929 +0000 UTC m=+1365.831529859" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.391449 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.397844 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-745fdc9fb8-jhz6h" Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.536116 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-774b97d54d-8xt86"] Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.537085 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-774b97d54d-8xt86" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-api" containerID="cri-o://cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9" gracePeriod=30 Feb 27 06:33:07 crc kubenswrapper[4725]: I0227 06:33:07.537357 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-774b97d54d-8xt86" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-log" containerID="cri-o://cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277" gracePeriod=30 Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.186552 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kjl9w"] Feb 27 06:33:08 crc kubenswrapper[4725]: W0227 06:33:08.224494 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c5dc31_7f88_4c7a_9f3a_e8dbfe4b7010.slice/crio-cdb92369ded4a016e124204d83dc05ccdf72cd314b0d8cf720972762d2d598ce WatchSource:0}: Error finding container cdb92369ded4a016e124204d83dc05ccdf72cd314b0d8cf720972762d2d598ce: Status 404 returned error can't find the container with id cdb92369ded4a016e124204d83dc05ccdf72cd314b0d8cf720972762d2d598ce Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.247524 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-kjl9w" event={"ID":"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010","Type":"ContainerStarted","Data":"cdb92369ded4a016e124204d83dc05ccdf72cd314b0d8cf720972762d2d598ce"} Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.254567 4725 generic.go:334] "Generic (PLEG): container finished" podID="768d31aa-5226-4119-b1d2-f66786c695f1" containerID="cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277" exitCode=143 Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.273395 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774b97d54d-8xt86" event={"ID":"768d31aa-5226-4119-b1d2-f66786c695f1","Type":"ContainerDied","Data":"cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277"} Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.279135 4725 generic.go:334] "Generic (PLEG): container finished" podID="9359342b-953e-4419-baff-b26bab3404c4" containerID="f30dfd51c600b5fb8dd30779305b2953492b61a4f3946d81174aaec1ac137200" exitCode=0 Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.280012 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerDied","Data":"f30dfd51c600b5fb8dd30779305b2953492b61a4f3946d81174aaec1ac137200"} Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.585114 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740003 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-config-data\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740414 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-log-httpd\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740522 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-combined-ca-bundle\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740554 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-scripts\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740732 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njkx\" (UniqueName: \"kubernetes.io/projected/9359342b-953e-4419-baff-b26bab3404c4-kube-api-access-7njkx\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740788 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-sg-core-conf-yaml\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.740869 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-run-httpd\") pod \"9359342b-953e-4419-baff-b26bab3404c4\" (UID: \"9359342b-953e-4419-baff-b26bab3404c4\") " Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.741028 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.741277 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.741709 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.741729 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9359342b-953e-4419-baff-b26bab3404c4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.760581 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9359342b-953e-4419-baff-b26bab3404c4-kube-api-access-7njkx" (OuterVolumeSpecName: "kube-api-access-7njkx") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "kube-api-access-7njkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.770719 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-scripts" (OuterVolumeSpecName: "scripts") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.817047 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.833642 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.843965 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njkx\" (UniqueName: \"kubernetes.io/projected/9359342b-953e-4419-baff-b26bab3404c4-kube-api-access-7njkx\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.843997 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.844005 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.844013 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.882552 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-config-data" (OuterVolumeSpecName: "config-data") pod "9359342b-953e-4419-baff-b26bab3404c4" (UID: "9359342b-953e-4419-baff-b26bab3404c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:08 crc kubenswrapper[4725]: I0227 06:33:08.945017 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9359342b-953e-4419-baff-b26bab3404c4-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.080213 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-774b97d54d-8xt86" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.249913 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-internal-tls-certs\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.249992 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-public-tls-certs\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.250099 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn74s\" (UniqueName: \"kubernetes.io/projected/768d31aa-5226-4119-b1d2-f66786c695f1-kube-api-access-wn74s\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.250134 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768d31aa-5226-4119-b1d2-f66786c695f1-logs\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.250180 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-combined-ca-bundle\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.250215 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-scripts\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.250237 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-config-data\") pod \"768d31aa-5226-4119-b1d2-f66786c695f1\" (UID: \"768d31aa-5226-4119-b1d2-f66786c695f1\") " Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.252079 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/768d31aa-5226-4119-b1d2-f66786c695f1-logs" (OuterVolumeSpecName: "logs") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.261489 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768d31aa-5226-4119-b1d2-f66786c695f1-kube-api-access-wn74s" (OuterVolumeSpecName: "kube-api-access-wn74s") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "kube-api-access-wn74s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.262526 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-scripts" (OuterVolumeSpecName: "scripts") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.308328 4725 generic.go:334] "Generic (PLEG): container finished" podID="768d31aa-5226-4119-b1d2-f66786c695f1" containerID="cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9" exitCode=0 Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.308491 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774b97d54d-8xt86" event={"ID":"768d31aa-5226-4119-b1d2-f66786c695f1","Type":"ContainerDied","Data":"cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9"} Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.308559 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774b97d54d-8xt86" event={"ID":"768d31aa-5226-4119-b1d2-f66786c695f1","Type":"ContainerDied","Data":"8d250688552b6c562756bc0ef1ab465b89bad976968887cb0cd4b2b350054400"} Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.308582 4725 scope.go:117] "RemoveContainer" containerID="cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.308802 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-774b97d54d-8xt86" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.323703 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.327746 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9359342b-953e-4419-baff-b26bab3404c4","Type":"ContainerDied","Data":"c92e66415e2a2886ece0a05d319ca89bdd74706066395807dd1c462fe0b74731"} Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.327871 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.337457 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-config-data" (OuterVolumeSpecName: "config-data") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.352728 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn74s\" (UniqueName: \"kubernetes.io/projected/768d31aa-5226-4119-b1d2-f66786c695f1-kube-api-access-wn74s\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.352763 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768d31aa-5226-4119-b1d2-f66786c695f1-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.352776 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.352787 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.352799 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.379767 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.428333 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "768d31aa-5226-4119-b1d2-f66786c695f1" (UID: "768d31aa-5226-4119-b1d2-f66786c695f1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.454973 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.455030 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/768d31aa-5226-4119-b1d2-f66786c695f1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.525028 4725 scope.go:117] "RemoveContainer" containerID="cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.525251 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.541952 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.591379 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.592456 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-log" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592475 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-log" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.592491 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="sg-core" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592498 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="sg-core" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.592507 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="proxy-httpd" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592515 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="proxy-httpd" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.592534 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-notification-agent" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592541 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-notification-agent" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.592559 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-api" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592567 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-api" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.592595 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-central-agent" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592603 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-central-agent" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592975 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-log" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.592998 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-notification-agent" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.593024 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="ceilometer-central-agent" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.593048 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="proxy-httpd" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.593058 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9359342b-953e-4419-baff-b26bab3404c4" containerName="sg-core" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.593071 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" containerName="placement-api" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.598535 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.602554 4725 scope.go:117] "RemoveContainer" containerID="cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.603754 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.603994 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.604658 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9\": container with ID starting with cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9 not found: ID does not exist" containerID="cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.604698 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9"} err="failed to get container status \"cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9\": rpc error: code = NotFound desc = could not find container \"cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9\": container with ID starting with cfe41b4dfa738a0772b85aef90dd294fe773a7a92c04fa0e25520ecf5cec54c9 not found: ID does not exist" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.604726 4725 scope.go:117] "RemoveContainer" containerID="cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277" Feb 27 06:33:09 crc kubenswrapper[4725]: E0227 06:33:09.610029 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277\": container with ID starting with cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277 not found: ID does not exist" containerID="cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.610075 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277"} err="failed to get container status \"cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277\": rpc error: code = NotFound desc = could not find container \"cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277\": container with ID starting with cc161aba5cbc0fedca535c8adf2f69b2551fe8e70e5f27a9407d35d5d1ca7277 not found: ID does not exist" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.610102 4725 scope.go:117] "RemoveContainer" containerID="bae54cabb3c0f1aafa9bf5eacba839fc7dc40db4fd51e32db2c4ccd544ff83a0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.620298 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.682625 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-774b97d54d-8xt86"] Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.683342 4725 scope.go:117] "RemoveContainer" containerID="2ec09f664051c3124b757c1cf9e362d72b55611be7d9c1eaaba3050e36088ece" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.694824 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-774b97d54d-8xt86"] Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.700698 4725 scope.go:117] "RemoveContainer" containerID="9bd39b1770b60d44a64dd9755922018a1f973ee19fb2f76330c25b903ea16a33" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.721480 4725 scope.go:117] "RemoveContainer" 
containerID="f30dfd51c600b5fb8dd30779305b2953492b61a4f3946d81174aaec1ac137200" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771035 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-log-httpd\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771094 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-config-data\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771129 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771169 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771213 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-scripts\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771238 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-run-httpd\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.771280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88h8\" (UniqueName: \"kubernetes.io/projected/69224a92-7871-4eed-a4e2-610744faeb6b-kube-api-access-t88h8\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873246 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-scripts\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873399 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-run-httpd\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873467 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88h8\" (UniqueName: \"kubernetes.io/projected/69224a92-7871-4eed-a4e2-610744faeb6b-kube-api-access-t88h8\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873596 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-log-httpd\") pod 
\"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873621 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-config-data\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873665 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.873719 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.875477 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-log-httpd\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.875917 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-run-httpd\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.880255 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-scripts\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.882953 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.883015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.883790 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-config-data\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.899047 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88h8\" (UniqueName: \"kubernetes.io/projected/69224a92-7871-4eed-a4e2-610744faeb6b-kube-api-access-t88h8\") pod \"ceilometer-0\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " pod="openstack/ceilometer-0" Feb 27 06:33:09 crc kubenswrapper[4725]: I0227 06:33:09.948147 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.180933 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.267016 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768d31aa-5226-4119-b1d2-f66786c695f1" path="/var/lib/kubelet/pods/768d31aa-5226-4119-b1d2-f66786c695f1/volumes" Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.268085 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9359342b-953e-4419-baff-b26bab3404c4" path="/var/lib/kubelet/pods/9359342b-953e-4419-baff-b26bab3404c4/volumes" Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.444279 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.444346 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.470250 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:10 crc kubenswrapper[4725]: W0227 06:33:10.504250 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69224a92_7871_4eed_a4e2_610744faeb6b.slice/crio-6495ea4c7f2f1c206ffef0d2db585ce1a989b11e1a12ca74fd579f11aa6fa892 WatchSource:0}: Error finding container 6495ea4c7f2f1c206ffef0d2db585ce1a989b11e1a12ca74fd579f11aa6fa892: Status 404 returned error can't find the container with id 6495ea4c7f2f1c206ffef0d2db585ce1a989b11e1a12ca74fd579f11aa6fa892 Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 06:33:10.600344 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:10 crc kubenswrapper[4725]: I0227 
06:33:10.624329 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:11 crc kubenswrapper[4725]: I0227 06:33:11.366179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerStarted","Data":"e77b98329f194fb8e477c189e88b09e923caa8ea80e1bfb4673612c0781fb805"} Feb 27 06:33:11 crc kubenswrapper[4725]: I0227 06:33:11.366619 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerStarted","Data":"53281335f043dd76727011a00cb81e73631a16fd90be0c27b8ba1097c6c18bc3"} Feb 27 06:33:11 crc kubenswrapper[4725]: I0227 06:33:11.367708 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:11 crc kubenswrapper[4725]: I0227 06:33:11.367744 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:11 crc kubenswrapper[4725]: I0227 06:33:11.367757 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerStarted","Data":"6495ea4c7f2f1c206ffef0d2db585ce1a989b11e1a12ca74fd579f11aa6fa892"} Feb 27 06:33:12 crc kubenswrapper[4725]: I0227 06:33:12.375545 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerStarted","Data":"28d6e194cd4a4cc3ddce257a1ccf8951cd0897ae33c9fcb80992466468f0cb05"} Feb 27 06:33:12 crc kubenswrapper[4725]: I0227 06:33:12.495533 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:12 crc kubenswrapper[4725]: I0227 06:33:12.562469 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:12 crc kubenswrapper[4725]: I0227 06:33:12.921122 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 27 06:33:13 crc kubenswrapper[4725]: I0227 06:33:13.383510 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:13 crc kubenswrapper[4725]: I0227 06:33:13.383818 4725 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 06:33:13 crc kubenswrapper[4725]: I0227 06:33:13.384024 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:13 crc kubenswrapper[4725]: I0227 06:33:13.388686 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 06:33:13 crc kubenswrapper[4725]: I0227 06:33:13.418750 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:13 crc kubenswrapper[4725]: I0227 06:33:13.464814 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.396829 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerStarted","Data":"492dbc47c86b452c153eeae62017fd4b9d194f2d8af57a9c220749c99abf3a5d"} Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.397444 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.397505 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="proxy-httpd" 
containerID="cri-o://492dbc47c86b452c153eeae62017fd4b9d194f2d8af57a9c220749c99abf3a5d" gracePeriod=30 Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.397747 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="sg-core" containerID="cri-o://28d6e194cd4a4cc3ddce257a1ccf8951cd0897ae33c9fcb80992466468f0cb05" gracePeriod=30 Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.398135 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-central-agent" containerID="cri-o://53281335f043dd76727011a00cb81e73631a16fd90be0c27b8ba1097c6c18bc3" gracePeriod=30 Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.398197 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-notification-agent" containerID="cri-o://e77b98329f194fb8e477c189e88b09e923caa8ea80e1bfb4673612c0781fb805" gracePeriod=30 Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.431593 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.443204066 podStartE2EDuration="5.431571498s" podCreationTimestamp="2026-02-27 06:33:09 +0000 UTC" firstStartedPulling="2026-02-27 06:33:10.513238718 +0000 UTC m=+1368.975859287" lastFinishedPulling="2026-02-27 06:33:13.50160615 +0000 UTC m=+1371.964226719" observedRunningTime="2026-02-27 06:33:14.427324658 +0000 UTC m=+1372.889945237" watchObservedRunningTime="2026-02-27 06:33:14.431571498 +0000 UTC m=+1372.894192077" Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.661786 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 
06:33:14.661828 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.704991 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 06:33:14 crc kubenswrapper[4725]: I0227 06:33:14.705981 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.416789 4725 generic.go:334] "Generic (PLEG): container finished" podID="69224a92-7871-4eed-a4e2-610744faeb6b" containerID="492dbc47c86b452c153eeae62017fd4b9d194f2d8af57a9c220749c99abf3a5d" exitCode=0 Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.416842 4725 generic.go:334] "Generic (PLEG): container finished" podID="69224a92-7871-4eed-a4e2-610744faeb6b" containerID="28d6e194cd4a4cc3ddce257a1ccf8951cd0897ae33c9fcb80992466468f0cb05" exitCode=2 Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.416850 4725 generic.go:334] "Generic (PLEG): container finished" podID="69224a92-7871-4eed-a4e2-610744faeb6b" containerID="e77b98329f194fb8e477c189e88b09e923caa8ea80e1bfb4673612c0781fb805" exitCode=0 Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.418102 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerDied","Data":"492dbc47c86b452c153eeae62017fd4b9d194f2d8af57a9c220749c99abf3a5d"} Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.418131 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerDied","Data":"28d6e194cd4a4cc3ddce257a1ccf8951cd0897ae33c9fcb80992466468f0cb05"} Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.418145 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerDied","Data":"e77b98329f194fb8e477c189e88b09e923caa8ea80e1bfb4673612c0781fb805"} Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.418313 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" containerID="cri-o://b5a0eb19b930d1dba5f2c039111c6651bc5348f3b6040a2539df5e5bf88e1380" gracePeriod=30 Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.419928 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.420072 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.903656 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.903861 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" containerName="watcher-applier" containerID="cri-o://f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2" gracePeriod=30 Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.942107 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.942356 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api-log" containerID="cri-o://2ee597a40feb69270dfb0da558b4dcc29edce818ba7e3d1462ef27563587d7b1" gracePeriod=30 Feb 27 06:33:15 crc kubenswrapper[4725]: I0227 06:33:15.942724 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api" containerID="cri-o://e497ebf419188b2973657e53aadfb8650798b12612159ae4d0ebc4f07718df87" gracePeriod=30 Feb 27 06:33:16 crc kubenswrapper[4725]: I0227 06:33:16.433136 4725 generic.go:334] "Generic (PLEG): container finished" podID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerID="2ee597a40feb69270dfb0da558b4dcc29edce818ba7e3d1462ef27563587d7b1" exitCode=143 Feb 27 06:33:16 crc kubenswrapper[4725]: I0227 06:33:16.433386 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c3439aa5-3edd-49b2-8d83-5a34cd55764b","Type":"ContainerDied","Data":"2ee597a40feb69270dfb0da558b4dcc29edce818ba7e3d1462ef27563587d7b1"} Feb 27 06:33:17 crc kubenswrapper[4725]: I0227 06:33:17.443865 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 06:33:17 crc kubenswrapper[4725]: I0227 06:33:17.445423 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 06:33:17 crc kubenswrapper[4725]: I0227 06:33:17.458722 4725 generic.go:334] "Generic (PLEG): container finished" podID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerID="e497ebf419188b2973657e53aadfb8650798b12612159ae4d0ebc4f07718df87" exitCode=0 Feb 27 06:33:17 crc kubenswrapper[4725]: I0227 06:33:17.458812 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c3439aa5-3edd-49b2-8d83-5a34cd55764b","Type":"ContainerDied","Data":"e497ebf419188b2973657e53aadfb8650798b12612159ae4d0ebc4f07718df87"} Feb 27 06:33:18 crc kubenswrapper[4725]: I0227 06:33:18.472043 4725 generic.go:334] "Generic (PLEG): container finished" podID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" containerID="f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2" 
exitCode=0 Feb 27 06:33:18 crc kubenswrapper[4725]: I0227 06:33:18.472130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"c42fc5be-c4bc-4ebc-8604-8d088212fbb5","Type":"ContainerDied","Data":"f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2"} Feb 27 06:33:18 crc kubenswrapper[4725]: E0227 06:33:18.893641 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2 is running failed: container process not found" containerID="f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 27 06:33:18 crc kubenswrapper[4725]: E0227 06:33:18.894199 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2 is running failed: container process not found" containerID="f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 27 06:33:18 crc kubenswrapper[4725]: E0227 06:33:18.894670 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2 is running failed: container process not found" containerID="f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 27 06:33:18 crc kubenswrapper[4725]: E0227 06:33:18.894700 4725 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2 is running failed: container process not 
found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" containerName="watcher-applier" Feb 27 06:33:19 crc kubenswrapper[4725]: I0227 06:33:19.129328 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.180:9322/\": dial tcp 10.217.0.180:9322: connect: connection refused" Feb 27 06:33:19 crc kubenswrapper[4725]: I0227 06:33:19.129413 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.180:9322/\": dial tcp 10.217.0.180:9322: connect: connection refused" Feb 27 06:33:20 crc kubenswrapper[4725]: I0227 06:33:20.523669 4725 generic.go:334] "Generic (PLEG): container finished" podID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerID="b5a0eb19b930d1dba5f2c039111c6651bc5348f3b6040a2539df5e5bf88e1380" exitCode=0 Feb 27 06:33:20 crc kubenswrapper[4725]: I0227 06:33:20.523718 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerDied","Data":"b5a0eb19b930d1dba5f2c039111c6651bc5348f3b6040a2539df5e5bf88e1380"} Feb 27 06:33:20 crc kubenswrapper[4725]: I0227 06:33:20.523762 4725 scope.go:117] "RemoveContainer" containerID="fa5fb31e73f1069735d589b1a31d8db9214d99f3b1533767bab94eaa0181d6da" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.409814 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.413158 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.415828 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.524586 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-config-data\") pod \"98e9aa25-5670-466b-92a2-26b711b3ccf4\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.524667 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-logs\") pod \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525427 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-config-data\") pod \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525484 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-custom-prometheus-ca\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525540 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-config-data\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc 
kubenswrapper[4725]: I0227 06:33:21.525561 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-combined-ca-bundle\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525600 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-combined-ca-bundle\") pod \"98e9aa25-5670-466b-92a2-26b711b3ccf4\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525625 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-combined-ca-bundle\") pod \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525665 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-custom-prometheus-ca\") pod \"98e9aa25-5670-466b-92a2-26b711b3ccf4\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525703 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3439aa5-3edd-49b2-8d83-5a34cd55764b-logs\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525742 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-public-tls-certs\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525758 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-internal-tls-certs\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525794 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98e9aa25-5670-466b-92a2-26b711b3ccf4-logs\") pod \"98e9aa25-5670-466b-92a2-26b711b3ccf4\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525819 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pbt8\" (UniqueName: \"kubernetes.io/projected/98e9aa25-5670-466b-92a2-26b711b3ccf4-kube-api-access-9pbt8\") pod \"98e9aa25-5670-466b-92a2-26b711b3ccf4\" (UID: \"98e9aa25-5670-466b-92a2-26b711b3ccf4\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525868 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl2fg\" (UniqueName: \"kubernetes.io/projected/c3439aa5-3edd-49b2-8d83-5a34cd55764b-kube-api-access-dl2fg\") pod \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\" (UID: \"c3439aa5-3edd-49b2-8d83-5a34cd55764b\") " Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.525889 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlhvj\" (UniqueName: \"kubernetes.io/projected/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-kube-api-access-qlhvj\") pod \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\" (UID: \"c42fc5be-c4bc-4ebc-8604-8d088212fbb5\") " Feb 27 06:33:21 crc 
kubenswrapper[4725]: I0227 06:33:21.531528 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3439aa5-3edd-49b2-8d83-5a34cd55764b-logs" (OuterVolumeSpecName: "logs") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.531702 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-logs" (OuterVolumeSpecName: "logs") pod "c42fc5be-c4bc-4ebc-8604-8d088212fbb5" (UID: "c42fc5be-c4bc-4ebc-8604-8d088212fbb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.532193 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e9aa25-5670-466b-92a2-26b711b3ccf4-logs" (OuterVolumeSpecName: "logs") pod "98e9aa25-5670-466b-92a2-26b711b3ccf4" (UID: "98e9aa25-5670-466b-92a2-26b711b3ccf4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.534965 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-kube-api-access-qlhvj" (OuterVolumeSpecName: "kube-api-access-qlhvj") pod "c42fc5be-c4bc-4ebc-8604-8d088212fbb5" (UID: "c42fc5be-c4bc-4ebc-8604-8d088212fbb5"). InnerVolumeSpecName "kube-api-access-qlhvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.562278 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3439aa5-3edd-49b2-8d83-5a34cd55764b-kube-api-access-dl2fg" (OuterVolumeSpecName: "kube-api-access-dl2fg") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "kube-api-access-dl2fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.562660 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e9aa25-5670-466b-92a2-26b711b3ccf4-kube-api-access-9pbt8" (OuterVolumeSpecName: "kube-api-access-9pbt8") pod "98e9aa25-5670-466b-92a2-26b711b3ccf4" (UID: "98e9aa25-5670-466b-92a2-26b711b3ccf4"). InnerVolumeSpecName "kube-api-access-9pbt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.567705 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"c42fc5be-c4bc-4ebc-8604-8d088212fbb5","Type":"ContainerDied","Data":"ef206f7fe3e449dd1ea9f4a2a81ed9b86e2d2ce31e52a5a6c3792422738668fd"} Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.567768 4725 scope.go:117] "RemoveContainer" containerID="f3a16ba147c8c17ecbe045b09ab6894a3a08a865a03e244b455d2c28c22caaf2" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.567954 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.575486 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c3439aa5-3edd-49b2-8d83-5a34cd55764b","Type":"ContainerDied","Data":"214e5b69dfd0a1b31b2603e521a20befd9ab7f0d0f617a81700e2785c2be79ed"} Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.575619 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.583112 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"98e9aa25-5670-466b-92a2-26b711b3ccf4","Type":"ContainerDied","Data":"5e03a47d41878c3e27e7745a3f685363d47fc205a380f355a45e39364eb32527"} Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.585435 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.611742 4725 scope.go:117] "RemoveContainer" containerID="e497ebf419188b2973657e53aadfb8650798b12612159ae4d0ebc4f07718df87" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.629270 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3439aa5-3edd-49b2-8d83-5a34cd55764b-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.629308 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98e9aa25-5670-466b-92a2-26b711b3ccf4-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.629317 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pbt8\" (UniqueName: \"kubernetes.io/projected/98e9aa25-5670-466b-92a2-26b711b3ccf4-kube-api-access-9pbt8\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 
crc kubenswrapper[4725]: I0227 06:33:21.629326 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl2fg\" (UniqueName: \"kubernetes.io/projected/c3439aa5-3edd-49b2-8d83-5a34cd55764b-kube-api-access-dl2fg\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.629334 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlhvj\" (UniqueName: \"kubernetes.io/projected/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-kube-api-access-qlhvj\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.629935 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.644160 4725 scope.go:117] "RemoveContainer" containerID="2ee597a40feb69270dfb0da558b4dcc29edce818ba7e3d1462ef27563587d7b1" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.668533 4725 scope.go:117] "RemoveContainer" containerID="b5a0eb19b930d1dba5f2c039111c6651bc5348f3b6040a2539df5e5bf88e1380" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.741217 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.768578 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-config-data" (OuterVolumeSpecName: "config-data") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.770350 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.772686 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98e9aa25-5670-466b-92a2-26b711b3ccf4" (UID: "98e9aa25-5670-466b-92a2-26b711b3ccf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.784242 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.793093 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c42fc5be-c4bc-4ebc-8604-8d088212fbb5" (UID: "c42fc5be-c4bc-4ebc-8604-8d088212fbb5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.804206 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "98e9aa25-5670-466b-92a2-26b711b3ccf4" (UID: "98e9aa25-5670-466b-92a2-26b711b3ccf4"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.806929 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-config-data" (OuterVolumeSpecName: "config-data") pod "98e9aa25-5670-466b-92a2-26b711b3ccf4" (UID: "98e9aa25-5670-466b-92a2-26b711b3ccf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.808607 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c3439aa5-3edd-49b2-8d83-5a34cd55764b" (UID: "c3439aa5-3edd-49b2-8d83-5a34cd55764b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.815568 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-config-data" (OuterVolumeSpecName: "config-data") pod "c42fc5be-c4bc-4ebc-8604-8d088212fbb5" (UID: "c42fc5be-c4bc-4ebc-8604-8d088212fbb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833771 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833805 4725 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833817 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833826 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833835 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833842 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42fc5be-c4bc-4ebc-8604-8d088212fbb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833850 4725 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 
06:33:21.833858 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833866 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3439aa5-3edd-49b2-8d83-5a34cd55764b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.833875 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e9aa25-5670-466b-92a2-26b711b3ccf4-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.904777 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.929106 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.944959 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.955042 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.979805 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:33:21 crc kubenswrapper[4725]: E0227 06:33:21.980260 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980277 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: E0227 06:33:21.980315 4725 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980322 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: E0227 06:33:21.980331 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980337 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: E0227 06:33:21.980347 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" containerName="watcher-applier" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980355 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" containerName="watcher-applier" Feb 27 06:33:21 crc kubenswrapper[4725]: E0227 06:33:21.980368 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980374 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api" Feb 27 06:33:21 crc kubenswrapper[4725]: E0227 06:33:21.980386 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api-log" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980391 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api-log" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 
06:33:21.980562 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" containerName="watcher-applier" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980575 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980587 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980599 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api-log" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980610 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" containerName="watcher-api" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980617 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.980638 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.981299 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.985901 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.986148 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m7t9c" Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.993856 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:33:21 crc kubenswrapper[4725]: I0227 06:33:21.999192 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.028396 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.037348 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:33:22 crc kubenswrapper[4725]: E0227 06:33:22.037769 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.037783 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" containerName="watcher-decision-engine" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.039007 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.040760 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2f1b2e7-bd25-401a-ae31-c49984f2c438-logs\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.040823 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49777\" (UniqueName: \"kubernetes.io/projected/b2f1b2e7-bd25-401a-ae31-c49984f2c438-kube-api-access-49777\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.040864 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f1b2e7-bd25-401a-ae31-c49984f2c438-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.040914 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f1b2e7-bd25-401a-ae31-c49984f2c438-config-data\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.044752 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.044942 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.045613 4725 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.046123 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.047469 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.051600 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.054415 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.084928 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.141952 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.141993 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142016 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2f1b2e7-bd25-401a-ae31-c49984f2c438-logs\") pod 
\"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142044 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142070 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzrq\" (UniqueName: \"kubernetes.io/projected/8ca0e165-57b9-4dbd-a8a8-e036ba316122-kube-api-access-mqzrq\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142087 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142104 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-config-data\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142121 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhvd\" (UniqueName: \"kubernetes.io/projected/a075032f-0182-44f6-8dd4-b190bf27ed02-kube-api-access-crhvd\") pod \"watcher-decision-engine-0\" (UID: 
\"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142153 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49777\" (UniqueName: \"kubernetes.io/projected/b2f1b2e7-bd25-401a-ae31-c49984f2c438-kube-api-access-49777\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142176 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-config-data\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142202 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-public-tls-certs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142224 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f1b2e7-bd25-401a-ae31-c49984f2c438-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142296 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f1b2e7-bd25-401a-ae31-c49984f2c438-config-data\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 
06:33:22.142312 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a075032f-0182-44f6-8dd4-b190bf27ed02-logs\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca0e165-57b9-4dbd-a8a8-e036ba316122-logs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142361 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.142960 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2f1b2e7-bd25-401a-ae31-c49984f2c438-logs\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.146897 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f1b2e7-bd25-401a-ae31-c49984f2c438-config-data\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.150263 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2f1b2e7-bd25-401a-ae31-c49984f2c438-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.171029 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49777\" (UniqueName: \"kubernetes.io/projected/b2f1b2e7-bd25-401a-ae31-c49984f2c438-kube-api-access-49777\") pod \"watcher-applier-0\" (UID: \"b2f1b2e7-bd25-401a-ae31-c49984f2c438\") " pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243540 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-public-tls-certs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243640 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a075032f-0182-44f6-8dd4-b190bf27ed02-logs\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243675 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca0e165-57b9-4dbd-a8a8-e036ba316122-logs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243690 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 
06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243725 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243742 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243769 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243793 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzrq\" (UniqueName: \"kubernetes.io/projected/8ca0e165-57b9-4dbd-a8a8-e036ba316122-kube-api-access-mqzrq\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243811 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243826 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-config-data\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243842 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crhvd\" (UniqueName: \"kubernetes.io/projected/a075032f-0182-44f6-8dd4-b190bf27ed02-kube-api-access-crhvd\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.243878 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-config-data\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.244360 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a075032f-0182-44f6-8dd4-b190bf27ed02-logs\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.244636 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca0e165-57b9-4dbd-a8a8-e036ba316122-logs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.248760 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.249092 4725 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"cert-watcher-public-svc" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.249216 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.249363 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.251353 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.252973 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.253122 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.253602 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.257728 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-public-tls-certs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.258028 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075032f-0182-44f6-8dd4-b190bf27ed02-config-data\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.258937 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-config-data\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.260249 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca0e165-57b9-4dbd-a8a8-e036ba316122-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.262778 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhvd\" (UniqueName: \"kubernetes.io/projected/a075032f-0182-44f6-8dd4-b190bf27ed02-kube-api-access-crhvd\") pod \"watcher-decision-engine-0\" (UID: \"a075032f-0182-44f6-8dd4-b190bf27ed02\") " pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.266452 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzrq\" (UniqueName: \"kubernetes.io/projected/8ca0e165-57b9-4dbd-a8a8-e036ba316122-kube-api-access-mqzrq\") pod \"watcher-api-0\" (UID: 
\"8ca0e165-57b9-4dbd-a8a8-e036ba316122\") " pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.272990 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e9aa25-5670-466b-92a2-26b711b3ccf4" path="/var/lib/kubelet/pods/98e9aa25-5670-466b-92a2-26b711b3ccf4/volumes" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.273991 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3439aa5-3edd-49b2-8d83-5a34cd55764b" path="/var/lib/kubelet/pods/c3439aa5-3edd-49b2-8d83-5a34cd55764b/volumes" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.274787 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c42fc5be-c4bc-4ebc-8604-8d088212fbb5" path="/var/lib/kubelet/pods/c42fc5be-c4bc-4ebc-8604-8d088212fbb5/volumes" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.306826 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m7t9c" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.317056 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.378741 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.395102 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.635798 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" event={"ID":"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010","Type":"ContainerStarted","Data":"b909b7e49a73595ec27c5d1a653740fb84a267c3e44c0b7eb176ff7cb3088e70"} Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.696130 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" podStartSLOduration=2.568390199 podStartE2EDuration="15.69610464s" podCreationTimestamp="2026-02-27 06:33:07 +0000 UTC" firstStartedPulling="2026-02-27 06:33:08.240667418 +0000 UTC m=+1366.703287987" lastFinishedPulling="2026-02-27 06:33:21.368381859 +0000 UTC m=+1379.831002428" observedRunningTime="2026-02-27 06:33:22.65774668 +0000 UTC m=+1381.120367259" watchObservedRunningTime="2026-02-27 06:33:22.69610464 +0000 UTC m=+1381.158725209" Feb 27 06:33:22 crc kubenswrapper[4725]: I0227 06:33:22.720956 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.017041 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.087197 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.645448 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b2f1b2e7-bd25-401a-ae31-c49984f2c438","Type":"ContainerStarted","Data":"802b7c393bc1df1476779f4eb9ad6fcaef24ac36f72af8d12c1a84aa31047710"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.645752 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" 
event={"ID":"b2f1b2e7-bd25-401a-ae31-c49984f2c438","Type":"ContainerStarted","Data":"22226cc204802e9a469f251dea128fe3a1bea7ec9c3e4c94e5ab06521708a5ad"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.646934 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8ca0e165-57b9-4dbd-a8a8-e036ba316122","Type":"ContainerStarted","Data":"b3fb6f39e89f4179c7cfdf9cdf96acefefd8a0b9d05ae9d0c5d0e1346e2085e4"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.646958 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8ca0e165-57b9-4dbd-a8a8-e036ba316122","Type":"ContainerStarted","Data":"7d56991234a47aa6b20ad772a8447f1d4c6ad65528a494540deca21f29937dd1"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.646968 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8ca0e165-57b9-4dbd-a8a8-e036ba316122","Type":"ContainerStarted","Data":"402479f100ece411d329b7ceb06de742cddc632f83e3880ae22888191bbf8fdc"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.647125 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.648079 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"a075032f-0182-44f6-8dd4-b190bf27ed02","Type":"ContainerStarted","Data":"7ea57e5d07e6b0d5387b956d42069f5e2aca4670a573aa9b33674c49da081c51"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.648100 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"a075032f-0182-44f6-8dd4-b190bf27ed02","Type":"ContainerStarted","Data":"2afe79d99feff2d632e5956e65218af7b7d1b6e31af4ef25bf1edfde6b1e35f1"} Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.669502 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/watcher-applier-0" podStartSLOduration=2.6694866409999998 podStartE2EDuration="2.669486641s" podCreationTimestamp="2026-02-27 06:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:23.66801652 +0000 UTC m=+1382.130637089" watchObservedRunningTime="2026-02-27 06:33:23.669486641 +0000 UTC m=+1382.132107220" Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.690775 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.69075847 podStartE2EDuration="2.69075847s" podCreationTimestamp="2026-02-27 06:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:23.687949051 +0000 UTC m=+1382.150569610" watchObservedRunningTime="2026-02-27 06:33:23.69075847 +0000 UTC m=+1382.153379039" Feb 27 06:33:23 crc kubenswrapper[4725]: I0227 06:33:23.721578 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.721555238 podStartE2EDuration="2.721555238s" podCreationTimestamp="2026-02-27 06:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:23.710123966 +0000 UTC m=+1382.172744555" watchObservedRunningTime="2026-02-27 06:33:23.721555238 +0000 UTC m=+1382.184175807" Feb 27 06:33:25 crc kubenswrapper[4725]: I0227 06:33:25.670638 4725 generic.go:334] "Generic (PLEG): container finished" podID="69224a92-7871-4eed-a4e2-610744faeb6b" containerID="53281335f043dd76727011a00cb81e73631a16fd90be0c27b8ba1097c6c18bc3" exitCode=0 Feb 27 06:33:25 crc kubenswrapper[4725]: I0227 06:33:25.670795 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerDied","Data":"53281335f043dd76727011a00cb81e73631a16fd90be0c27b8ba1097c6c18bc3"} Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.528764 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.892617 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951133 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-config-data\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951212 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-combined-ca-bundle\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951273 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t88h8\" (UniqueName: \"kubernetes.io/projected/69224a92-7871-4eed-a4e2-610744faeb6b-kube-api-access-t88h8\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951336 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-scripts\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951390 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-sg-core-conf-yaml\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951452 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-log-httpd\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.951610 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-run-httpd\") pod \"69224a92-7871-4eed-a4e2-610744faeb6b\" (UID: \"69224a92-7871-4eed-a4e2-610744faeb6b\") " Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.952710 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.956174 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:26 crc kubenswrapper[4725]: I0227 06:33:26.982421 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-scripts" (OuterVolumeSpecName: "scripts") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:26.995576 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69224a92-7871-4eed-a4e2-610744faeb6b-kube-api-access-t88h8" (OuterVolumeSpecName: "kube-api-access-t88h8") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). InnerVolumeSpecName "kube-api-access-t88h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.007261 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.052459 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-config-data" (OuterVolumeSpecName: "config-data") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.053604 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.053625 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.053633 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69224a92-7871-4eed-a4e2-610744faeb6b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.053642 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.053650 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t88h8\" (UniqueName: \"kubernetes.io/projected/69224a92-7871-4eed-a4e2-610744faeb6b-kube-api-access-t88h8\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.053660 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.074308 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69224a92-7871-4eed-a4e2-610744faeb6b" (UID: "69224a92-7871-4eed-a4e2-610744faeb6b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.155323 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69224a92-7871-4eed-a4e2-610744faeb6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.318398 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.379540 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.701962 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69224a92-7871-4eed-a4e2-610744faeb6b","Type":"ContainerDied","Data":"6495ea4c7f2f1c206ffef0d2db585ce1a989b11e1a12ca74fd579f11aa6fa892"} Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.702027 4725 scope.go:117] "RemoveContainer" containerID="492dbc47c86b452c153eeae62017fd4b9d194f2d8af57a9c220749c99abf3a5d" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.702234 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.761431 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.770162 4725 scope.go:117] "RemoveContainer" containerID="28d6e194cd4a4cc3ddce257a1ccf8951cd0897ae33c9fcb80992466468f0cb05" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.777699 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791040 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:27 crc kubenswrapper[4725]: E0227 06:33:27.791471 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-central-agent" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791488 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-central-agent" Feb 27 06:33:27 crc kubenswrapper[4725]: E0227 06:33:27.791508 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="sg-core" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791515 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="sg-core" Feb 27 06:33:27 crc kubenswrapper[4725]: E0227 06:33:27.791537 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-notification-agent" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791543 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-notification-agent" Feb 27 06:33:27 crc kubenswrapper[4725]: E0227 06:33:27.791550 4725 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="proxy-httpd" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791556 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="proxy-httpd" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791743 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-central-agent" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791761 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="sg-core" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791770 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="ceilometer-notification-agent" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.791788 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" containerName="proxy-httpd" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.794465 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.797985 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.798683 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.808228 4725 scope.go:117] "RemoveContainer" containerID="e77b98329f194fb8e477c189e88b09e923caa8ea80e1bfb4673612c0781fb805" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.811124 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.868367 4725 scope.go:117] "RemoveContainer" containerID="53281335f043dd76727011a00cb81e73631a16fd90be0c27b8ba1097c6c18bc3" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869388 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-scripts\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869472 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869495 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-run-httpd\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 
27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869534 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-log-httpd\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869585 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869645 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-config-data\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.869664 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ccr\" (UniqueName: \"kubernetes.io/projected/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-kube-api-access-64ccr\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.970864 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.970914 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-run-httpd\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.970964 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-log-httpd\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.971027 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.971099 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-config-data\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.971121 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64ccr\" (UniqueName: \"kubernetes.io/projected/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-kube-api-access-64ccr\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.971172 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-scripts\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc 
kubenswrapper[4725]: I0227 06:33:27.972854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-log-httpd\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.974491 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-run-httpd\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.975743 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.976035 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-config-data\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.976758 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-scripts\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.977140 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:27 crc kubenswrapper[4725]: I0227 06:33:27.988362 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ccr\" (UniqueName: \"kubernetes.io/projected/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-kube-api-access-64ccr\") pod \"ceilometer-0\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " pod="openstack/ceilometer-0" Feb 27 06:33:28 crc kubenswrapper[4725]: I0227 06:33:28.053980 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:28 crc kubenswrapper[4725]: I0227 06:33:28.054749 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:28 crc kubenswrapper[4725]: I0227 06:33:28.272266 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69224a92-7871-4eed-a4e2-610744faeb6b" path="/var/lib/kubelet/pods/69224a92-7871-4eed-a4e2-610744faeb6b/volumes" Feb 27 06:33:28 crc kubenswrapper[4725]: I0227 06:33:28.564017 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:28 crc kubenswrapper[4725]: I0227 06:33:28.716893 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerStarted","Data":"6e77b1d7e836465504cf4f6a6b98e6a1e2a0b3b444ea93f882caad10741b3863"} Feb 27 06:33:29 crc kubenswrapper[4725]: I0227 06:33:29.732904 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerStarted","Data":"16703e9f6ce85bbf7de1720cf83d3384f582740f5f75a3ec7d4d76dc67ab8d37"} Feb 27 06:33:29 crc kubenswrapper[4725]: I0227 06:33:29.733313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerStarted","Data":"37c8b3edaa994724d0f91a40b0ee8f64e85203936b9836449c31bccc2501ece7"} Feb 27 06:33:30 crc kubenswrapper[4725]: I0227 06:33:30.752561 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerStarted","Data":"c25d2bb577953067b86345dd3d205fa2c10c7795f55dfdfc5a40bbdd34cb2f2b"} Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.317833 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.360873 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.380462 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.396204 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.401524 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.439102 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.817246 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-central-agent" containerID="cri-o://37c8b3edaa994724d0f91a40b0ee8f64e85203936b9836449c31bccc2501ece7" gracePeriod=30 Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.817481 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerStarted","Data":"c4808ccaf1af4b71c5a6aad48905f99e34c1fd2396a226e2e7daec60729a0883"} Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.818402 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.818650 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-notification-agent" containerID="cri-o://16703e9f6ce85bbf7de1720cf83d3384f582740f5f75a3ec7d4d76dc67ab8d37" gracePeriod=30 Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.818719 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="proxy-httpd" containerID="cri-o://c4808ccaf1af4b71c5a6aad48905f99e34c1fd2396a226e2e7daec60729a0883" gracePeriod=30 Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.818751 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="sg-core" containerID="cri-o://c25d2bb577953067b86345dd3d205fa2c10c7795f55dfdfc5a40bbdd34cb2f2b" gracePeriod=30 Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.818910 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.861225 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.460458524 podStartE2EDuration="5.861206975s" podCreationTimestamp="2026-02-27 06:33:27 +0000 UTC" firstStartedPulling="2026-02-27 06:33:28.568496227 +0000 UTC m=+1387.031116796" lastFinishedPulling="2026-02-27 06:33:31.969244678 +0000 UTC m=+1390.431865247" 
observedRunningTime="2026-02-27 06:33:32.85463972 +0000 UTC m=+1391.317260289" watchObservedRunningTime="2026-02-27 06:33:32.861206975 +0000 UTC m=+1391.323827544" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.862958 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.880843 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 27 06:33:32 crc kubenswrapper[4725]: I0227 06:33:32.960248 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 27 06:33:33 crc kubenswrapper[4725]: I0227 06:33:33.830163 4725 generic.go:334] "Generic (PLEG): container finished" podID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerID="c4808ccaf1af4b71c5a6aad48905f99e34c1fd2396a226e2e7daec60729a0883" exitCode=0 Feb 27 06:33:33 crc kubenswrapper[4725]: I0227 06:33:33.830438 4725 generic.go:334] "Generic (PLEG): container finished" podID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerID="c25d2bb577953067b86345dd3d205fa2c10c7795f55dfdfc5a40bbdd34cb2f2b" exitCode=2 Feb 27 06:33:33 crc kubenswrapper[4725]: I0227 06:33:33.830448 4725 generic.go:334] "Generic (PLEG): container finished" podID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerID="16703e9f6ce85bbf7de1720cf83d3384f582740f5f75a3ec7d4d76dc67ab8d37" exitCode=0 Feb 27 06:33:33 crc kubenswrapper[4725]: I0227 06:33:33.830227 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerDied","Data":"c4808ccaf1af4b71c5a6aad48905f99e34c1fd2396a226e2e7daec60729a0883"} Feb 27 06:33:33 crc kubenswrapper[4725]: I0227 06:33:33.830635 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerDied","Data":"c25d2bb577953067b86345dd3d205fa2c10c7795f55dfdfc5a40bbdd34cb2f2b"} Feb 27 06:33:33 crc kubenswrapper[4725]: I0227 06:33:33.830662 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerDied","Data":"16703e9f6ce85bbf7de1720cf83d3384f582740f5f75a3ec7d4d76dc67ab8d37"} Feb 27 06:33:36 crc kubenswrapper[4725]: I0227 06:33:36.873494 4725 generic.go:334] "Generic (PLEG): container finished" podID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerID="37c8b3edaa994724d0f91a40b0ee8f64e85203936b9836449c31bccc2501ece7" exitCode=0 Feb 27 06:33:36 crc kubenswrapper[4725]: I0227 06:33:36.873636 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerDied","Data":"37c8b3edaa994724d0f91a40b0ee8f64e85203936b9836449c31bccc2501ece7"} Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.153917 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.332007 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-log-httpd\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.335617 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-run-httpd\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.335706 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-sg-core-conf-yaml\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.332848 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.335853 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.335865 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-scripts\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.335931 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-combined-ca-bundle\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.335970 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-config-data\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.336272 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64ccr\" (UniqueName: \"kubernetes.io/projected/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-kube-api-access-64ccr\") pod \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\" (UID: \"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d\") " Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.337331 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.337355 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 
crc kubenswrapper[4725]: I0227 06:33:37.361416 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-scripts" (OuterVolumeSpecName: "scripts") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.370591 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.374837 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-kube-api-access-64ccr" (OuterVolumeSpecName: "kube-api-access-64ccr") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "kube-api-access-64ccr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.438656 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64ccr\" (UniqueName: \"kubernetes.io/projected/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-kube-api-access-64ccr\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.438938 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.438953 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.447557 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.470887 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-config-data" (OuterVolumeSpecName: "config-data") pod "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" (UID: "2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.540800 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.541050 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.891749 4725 generic.go:334] "Generic (PLEG): container finished" podID="56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" containerID="b909b7e49a73595ec27c5d1a653740fb84a267c3e44c0b7eb176ff7cb3088e70" exitCode=0 Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.891838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" event={"ID":"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010","Type":"ContainerDied","Data":"b909b7e49a73595ec27c5d1a653740fb84a267c3e44c0b7eb176ff7cb3088e70"} Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.897894 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d","Type":"ContainerDied","Data":"6e77b1d7e836465504cf4f6a6b98e6a1e2a0b3b444ea93f882caad10741b3863"} Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.897958 4725 scope.go:117] "RemoveContainer" containerID="c4808ccaf1af4b71c5a6aad48905f99e34c1fd2396a226e2e7daec60729a0883" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.898155 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.939417 4725 scope.go:117] "RemoveContainer" containerID="c25d2bb577953067b86345dd3d205fa2c10c7795f55dfdfc5a40bbdd34cb2f2b" Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.962470 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.968876 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:37 crc kubenswrapper[4725]: I0227 06:33:37.974934 4725 scope.go:117] "RemoveContainer" containerID="16703e9f6ce85bbf7de1720cf83d3384f582740f5f75a3ec7d4d76dc67ab8d37" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.009142 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:38 crc kubenswrapper[4725]: E0227 06:33:38.009828 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="sg-core" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.009857 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="sg-core" Feb 27 06:33:38 crc kubenswrapper[4725]: E0227 06:33:38.009912 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-central-agent" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.009926 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-central-agent" Feb 27 06:33:38 crc kubenswrapper[4725]: E0227 06:33:38.009950 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-notification-agent" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.009962 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-notification-agent" Feb 27 06:33:38 crc kubenswrapper[4725]: E0227 06:33:38.009992 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="proxy-httpd" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.010005 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="proxy-httpd" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.010399 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-central-agent" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.010442 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="sg-core" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.010461 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="proxy-httpd" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.010495 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" containerName="ceilometer-notification-agent" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.013738 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.017199 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.017436 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.019473 4725 scope.go:117] "RemoveContainer" containerID="37c8b3edaa994724d0f91a40b0ee8f64e85203936b9836449c31bccc2501ece7" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.024522 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159722 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159780 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-run-httpd\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159820 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159886 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-config-data\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159902 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-scripts\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159924 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-log-httpd\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.159940 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjc8\" (UniqueName: \"kubernetes.io/projected/970a7b0c-8b62-418e-8317-b408eb9e70a0-kube-api-access-wsjc8\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.261585 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.261741 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-config-data\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " 
pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.261775 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-scripts\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.261931 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-log-httpd\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.261971 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjc8\" (UniqueName: \"kubernetes.io/projected/970a7b0c-8b62-418e-8317-b408eb9e70a0-kube-api-access-wsjc8\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.262148 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.262215 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-run-httpd\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.262742 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-log-httpd\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.265402 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-run-httpd\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.278833 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.278948 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-scripts\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.279160 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-config-data\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.283587 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjc8\" (UniqueName: \"kubernetes.io/projected/970a7b0c-8b62-418e-8317-b408eb9e70a0-kube-api-access-wsjc8\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.284032 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " pod="openstack/ceilometer-0" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.287309 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d" path="/var/lib/kubelet/pods/2c8eb224-58bd-49ef-8667-5b1ecb3a8f2d/volumes" Feb 27 06:33:38 crc kubenswrapper[4725]: I0227 06:33:38.352822 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:38.655899 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:33:39 crc kubenswrapper[4725]: W0227 06:33:38.660921 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod970a7b0c_8b62_418e_8317_b408eb9e70a0.slice/crio-ac1a9d96f5d6900239ce4eda668a79e4abe90ebe43580b238862ec19e70a8a84 WatchSource:0}: Error finding container ac1a9d96f5d6900239ce4eda668a79e4abe90ebe43580b238862ec19e70a8a84: Status 404 returned error can't find the container with id ac1a9d96f5d6900239ce4eda668a79e4abe90ebe43580b238862ec19e70a8a84 Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:38.910909 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerStarted","Data":"ac1a9d96f5d6900239ce4eda668a79e4abe90ebe43580b238862ec19e70a8a84"} Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.440724 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.593281 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-combined-ca-bundle\") pod \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.593400 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzvdl\" (UniqueName: \"kubernetes.io/projected/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-kube-api-access-xzvdl\") pod \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.593595 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-config-data\") pod \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.593627 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-scripts\") pod \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\" (UID: \"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010\") " Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.598756 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-kube-api-access-xzvdl" (OuterVolumeSpecName: "kube-api-access-xzvdl") pod "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" (UID: "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010"). InnerVolumeSpecName "kube-api-access-xzvdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.598877 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-scripts" (OuterVolumeSpecName: "scripts") pod "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" (UID: "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.620481 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" (UID: "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.621093 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-config-data" (OuterVolumeSpecName: "config-data") pod "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" (UID: "56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.695787 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.695823 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.695832 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.695843 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzvdl\" (UniqueName: \"kubernetes.io/projected/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010-kube-api-access-xzvdl\") on node \"crc\" DevicePath \"\"" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.923016 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" event={"ID":"56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010","Type":"ContainerDied","Data":"cdb92369ded4a016e124204d83dc05ccdf72cd314b0d8cf720972762d2d598ce"} Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.923807 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb92369ded4a016e124204d83dc05ccdf72cd314b0d8cf720972762d2d598ce" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.923045 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kjl9w" Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.926957 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerStarted","Data":"3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356"} Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.927002 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerStarted","Data":"db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7"} Feb 27 06:33:39 crc kubenswrapper[4725]: I0227 06:33:39.997123 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 06:33:40 crc kubenswrapper[4725]: E0227 06:33:40.002088 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" containerName="nova-cell0-conductor-db-sync" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.002109 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" containerName="nova-cell0-conductor-db-sync" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.002330 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" containerName="nova-cell0-conductor-db-sync" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.002928 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.006501 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zzbqh" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.006730 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.014418 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.105985 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20427622-030f-4f0a-870e-6119d307befa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.106106 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm6vg\" (UniqueName: \"kubernetes.io/projected/20427622-030f-4f0a-870e-6119d307befa-kube-api-access-fm6vg\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.106251 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20427622-030f-4f0a-870e-6119d307befa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.208239 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/20427622-030f-4f0a-870e-6119d307befa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.208311 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20427622-030f-4f0a-870e-6119d307befa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.208367 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm6vg\" (UniqueName: \"kubernetes.io/projected/20427622-030f-4f0a-870e-6119d307befa-kube-api-access-fm6vg\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.212988 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20427622-030f-4f0a-870e-6119d307befa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.213032 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20427622-030f-4f0a-870e-6119d307befa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.250861 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm6vg\" (UniqueName: \"kubernetes.io/projected/20427622-030f-4f0a-870e-6119d307befa-kube-api-access-fm6vg\") pod \"nova-cell0-conductor-0\" (UID: 
\"20427622-030f-4f0a-870e-6119d307befa\") " pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.321684 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:40 crc kubenswrapper[4725]: W0227 06:33:40.868280 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20427622_030f_4f0a_870e_6119d307befa.slice/crio-5a908fec1416210603765f37b47736d0d4e3e6c3e35d1041189ac60d5a91e855 WatchSource:0}: Error finding container 5a908fec1416210603765f37b47736d0d4e3e6c3e35d1041189ac60d5a91e855: Status 404 returned error can't find the container with id 5a908fec1416210603765f37b47736d0d4e3e6c3e35d1041189ac60d5a91e855 Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.887478 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.941789 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"20427622-030f-4f0a-870e-6119d307befa","Type":"ContainerStarted","Data":"5a908fec1416210603765f37b47736d0d4e3e6c3e35d1041189ac60d5a91e855"} Feb 27 06:33:40 crc kubenswrapper[4725]: I0227 06:33:40.945781 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerStarted","Data":"9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b"} Feb 27 06:33:41 crc kubenswrapper[4725]: I0227 06:33:41.955880 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"20427622-030f-4f0a-870e-6119d307befa","Type":"ContainerStarted","Data":"80f5fe277c5dc50c448ae5eb227a2fb3107a5c9e3dff85371b71b20510946d72"} Feb 27 06:33:41 crc kubenswrapper[4725]: I0227 06:33:41.956205 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:41 crc kubenswrapper[4725]: I0227 06:33:41.981496 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.981479235 podStartE2EDuration="2.981479235s" podCreationTimestamp="2026-02-27 06:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:41.977125412 +0000 UTC m=+1400.439745981" watchObservedRunningTime="2026-02-27 06:33:41.981479235 +0000 UTC m=+1400.444099804" Feb 27 06:33:42 crc kubenswrapper[4725]: I0227 06:33:42.969764 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerStarted","Data":"3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc"} Feb 27 06:33:42 crc kubenswrapper[4725]: I0227 06:33:42.996010 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.261192645 podStartE2EDuration="5.995989404s" podCreationTimestamp="2026-02-27 06:33:37 +0000 UTC" firstStartedPulling="2026-02-27 06:33:38.663639161 +0000 UTC m=+1397.126259730" lastFinishedPulling="2026-02-27 06:33:42.39843592 +0000 UTC m=+1400.861056489" observedRunningTime="2026-02-27 06:33:42.989729978 +0000 UTC m=+1401.452350567" watchObservedRunningTime="2026-02-27 06:33:42.995989404 +0000 UTC m=+1401.458609973" Feb 27 06:33:43 crc kubenswrapper[4725]: I0227 06:33:43.978595 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:33:50 crc kubenswrapper[4725]: I0227 06:33:50.375190 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.159339 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-g7bnn"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.162083 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.166876 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.169094 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.174526 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-g7bnn"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.295010 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-config-data\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.295095 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsq7d\" (UniqueName: \"kubernetes.io/projected/29c41298-c7b2-4ea5-b628-ac1149cfa7da-kube-api-access-lsq7d\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.295167 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 
crc kubenswrapper[4725]: I0227 06:33:51.295195 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-scripts\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.348642 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.353452 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.358634 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.360015 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.397463 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-config-data\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.397581 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsq7d\" (UniqueName: \"kubernetes.io/projected/29c41298-c7b2-4ea5-b628-ac1149cfa7da-kube-api-access-lsq7d\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.397650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.397692 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-scripts\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.404381 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-config-data\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.404724 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-scripts\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.413230 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.428996 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.430794 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.438673 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.444619 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.468099 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsq7d\" (UniqueName: \"kubernetes.io/projected/29c41298-c7b2-4ea5-b628-ac1149cfa7da-kube-api-access-lsq7d\") pod \"nova-cell0-cell-mapping-g7bnn\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.486320 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.487603 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.512056 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.522481 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.522704 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-config-data\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.522805 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80307e35-9ed6-44cb-81e6-157d907abb4d-logs\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.522903 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76zmh\" (UniqueName: \"kubernetes.io/projected/80307e35-9ed6-44cb-81e6-157d907abb4d-kube-api-access-76zmh\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.523651 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g7bnn"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.526164 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.629541 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.633213 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634492 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634617 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-config-data\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634685 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shdlb\" (UniqueName: \"kubernetes.io/projected/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-kube-api-access-shdlb\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634726 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80307e35-9ed6-44cb-81e6-157d907abb4d-logs\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634753 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-logs\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wcl\" (UniqueName: \"kubernetes.io/projected/9d2a9b00-096a-40dc-8477-1fb98996f32a-kube-api-access-k9wcl\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634862 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76zmh\" (UniqueName: \"kubernetes.io/projected/80307e35-9ed6-44cb-81e6-157d907abb4d-kube-api-access-76zmh\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634928 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.634979 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-config-data\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.639057 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80307e35-9ed6-44cb-81e6-157d907abb4d-logs\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.640892 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.645205 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-config-data\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.662531 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.697103 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76zmh\" (UniqueName: \"kubernetes.io/projected/80307e35-9ed6-44cb-81e6-157d907abb4d-kube-api-access-76zmh\") pod \"nova-api-0\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.712968 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.716879 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.723229 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shdlb\" (UniqueName: \"kubernetes.io/projected/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-kube-api-access-shdlb\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753053 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-config-data\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753073 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-logs\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753111 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wcl\" (UniqueName: \"kubernetes.io/projected/9d2a9b00-096a-40dc-8477-1fb98996f32a-kube-api-access-k9wcl\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753149 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753179 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753214 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-config-data\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753258 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753299 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.753346 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnq5b\" (UniqueName: \"kubernetes.io/projected/0b360a03-f3d8-4111-a292-7c6d843c5897-kube-api-access-wnq5b\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.754183 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-logs\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.757645 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.757963 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-config-data\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.761394 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.763528 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.763605 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f575c69f9-k8vk2"]
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.766818 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.778621 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f575c69f9-k8vk2"]
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.783407 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shdlb\" (UniqueName: \"kubernetes.io/projected/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-kube-api-access-shdlb\") pod \"nova-metadata-0\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " pod="openstack/nova-metadata-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.784169 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9wcl\" (UniqueName: \"kubernetes.io/projected/9d2a9b00-096a-40dc-8477-1fb98996f32a-kube-api-access-k9wcl\") pod \"nova-cell1-novncproxy-0\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.854906 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-config\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.854971 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-config-data\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.854994 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-nb\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.855045 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.855086 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-svc\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.855111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh24v\" (UniqueName: \"kubernetes.io/projected/837a7d0c-6989-4582-9d05-8b5c73db83a2-kube-api-access-jh24v\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.855139 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-sb\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.855172 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-swift-storage-0\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.855197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnq5b\" (UniqueName: \"kubernetes.io/projected/0b360a03-f3d8-4111-a292-7c6d843c5897-kube-api-access-wnq5b\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.858939 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-config-data\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.859982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.871268 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnq5b\" (UniqueName: \"kubernetes.io/projected/0b360a03-f3d8-4111-a292-7c6d843c5897-kube-api-access-wnq5b\") pod \"nova-scheduler-0\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " pod="openstack/nova-scheduler-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.956869 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-nb\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.957276 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-svc\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.957343 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh24v\" (UniqueName: \"kubernetes.io/projected/837a7d0c-6989-4582-9d05-8b5c73db83a2-kube-api-access-jh24v\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.957380 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-sb\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.957424 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-swift-storage-0\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.957487 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-config\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.957959 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-nb\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.958790 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-svc\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.959354 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-sb\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.961412 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-swift-storage-0\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.961628 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-config\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.970752 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 06:33:51 crc kubenswrapper[4725]: I0227 06:33:51.992920 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh24v\" (UniqueName: \"kubernetes.io/projected/837a7d0c-6989-4582-9d05-8b5c73db83a2-kube-api-access-jh24v\") pod \"dnsmasq-dns-6f575c69f9-k8vk2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.004754 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.011809 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.056891 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.103880 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.195782 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-g7bnn"]
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.288039 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dhc4"]
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.290064 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.301860 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.301860 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.307702 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dhc4"]
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.394636 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-config-data\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.394722 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-scripts\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.394741 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trh4x\" (UniqueName: \"kubernetes.io/projected/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-kube-api-access-trh4x\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.394822 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.489022 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.497279 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-scripts\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.497340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trh4x\" (UniqueName: \"kubernetes.io/projected/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-kube-api-access-trh4x\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.497421 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.497480 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-config-data\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.501039 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-scripts\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.506402 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.508871 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-config-data\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.520741 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trh4x\" (UniqueName: \"kubernetes.io/projected/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-kube-api-access-trh4x\") pod \"nova-cell1-conductor-db-sync-9dhc4\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.640922 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dhc4"
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.688127 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.715926 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 06:33:52 crc kubenswrapper[4725]: W0227 06:33:52.724676 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d2a9b00_096a_40dc_8477_1fb98996f32a.slice/crio-686ebf5c96ed321ce900ef829e61f96a7a3d17270c109de2eb54a7285dfef76f WatchSource:0}: Error finding container 686ebf5c96ed321ce900ef829e61f96a7a3d17270c109de2eb54a7285dfef76f: Status 404 returned error can't find the container with id 686ebf5c96ed321ce900ef829e61f96a7a3d17270c109de2eb54a7285dfef76f
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.849318 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f575c69f9-k8vk2"]
Feb 27 06:33:52 crc kubenswrapper[4725]: W0227 06:33:52.872819 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod837a7d0c_6989_4582_9d05_8b5c73db83a2.slice/crio-1b1cc66f600bc539e474badecc623129859a77593b7c79139864e185b900188f WatchSource:0}: Error finding container 1b1cc66f600bc539e474badecc623129859a77593b7c79139864e185b900188f: Status 404 returned error can't find the container with id 1b1cc66f600bc539e474badecc623129859a77593b7c79139864e185b900188f
Feb 27 06:33:52 crc kubenswrapper[4725]: I0227 06:33:52.948175 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 06:33:52 crc kubenswrapper[4725]: W0227 06:33:52.951251 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b360a03_f3d8_4111_a292_7c6d843c5897.slice/crio-6d166bb84c9b855e0b80e4cfd7a722ac72f412f0c9eedfad6283192cf1cf956f WatchSource:0}: Error finding container 6d166bb84c9b855e0b80e4cfd7a722ac72f412f0c9eedfad6283192cf1cf956f: Status 404 returned error can't find the container with id 6d166bb84c9b855e0b80e4cfd7a722ac72f412f0c9eedfad6283192cf1cf956f
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.103683 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" event={"ID":"837a7d0c-6989-4582-9d05-8b5c73db83a2","Type":"ContainerStarted","Data":"1b1cc66f600bc539e474badecc623129859a77593b7c79139864e185b900188f"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.114779 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b360a03-f3d8-4111-a292-7c6d843c5897","Type":"ContainerStarted","Data":"6d166bb84c9b855e0b80e4cfd7a722ac72f412f0c9eedfad6283192cf1cf956f"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.121998 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80307e35-9ed6-44cb-81e6-157d907abb4d","Type":"ContainerStarted","Data":"ba580f9be45255970955eb03b05df609f433ea0cbfb973d5e374d1cbe932fc11"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.128151 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g7bnn" event={"ID":"29c41298-c7b2-4ea5-b628-ac1149cfa7da","Type":"ContainerStarted","Data":"07f6abf4996093c8ae50443c097e75baa035616a4ebcfb705657f9176d11d77b"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.128207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g7bnn" event={"ID":"29c41298-c7b2-4ea5-b628-ac1149cfa7da","Type":"ContainerStarted","Data":"9865cadecf32df7a844a0cb3ce9c51e48a11ca6031ffcc370df1f331a1fcc34f"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.134955 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9d2a9b00-096a-40dc-8477-1fb98996f32a","Type":"ContainerStarted","Data":"686ebf5c96ed321ce900ef829e61f96a7a3d17270c109de2eb54a7285dfef76f"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.140744 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b20acebe-3d2a-4342-9f32-2b12d70bfd1b","Type":"ContainerStarted","Data":"cdf093296091f994c845935e55c7a083826b351222ae339d8130bcde1a62a6b6"}
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.153696 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-g7bnn" podStartSLOduration=2.153680108 podStartE2EDuration="2.153680108s" podCreationTimestamp="2026-02-27 06:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:53.149302674 +0000 UTC m=+1411.611923243" watchObservedRunningTime="2026-02-27 06:33:53.153680108 +0000 UTC m=+1411.616300687"
Feb 27 06:33:53 crc kubenswrapper[4725]: I0227 06:33:53.271423 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dhc4"]
Feb 27 06:33:53 crc kubenswrapper[4725]: W0227 06:33:53.282875 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd6a75bb_19e2_4fa9_894c_a89271aa9c50.slice/crio-d907907b1982923cf68eab30410bdfb65d53d59eb0a07e6357f10e769c8098c1 WatchSource:0}: Error finding container d907907b1982923cf68eab30410bdfb65d53d59eb0a07e6357f10e769c8098c1: Status 404 returned error can't find the container with id d907907b1982923cf68eab30410bdfb65d53d59eb0a07e6357f10e769c8098c1
Feb 27 06:33:54 crc kubenswrapper[4725]: I0227 06:33:54.157719 4725 generic.go:334] "Generic (PLEG): container finished" podID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerID="bf811d13d002ced308326ee58960f034e249e82b4acebe5e02eec0b05d312fef" exitCode=0
Feb 27 06:33:54 crc kubenswrapper[4725]: I0227 06:33:54.158089 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" event={"ID":"837a7d0c-6989-4582-9d05-8b5c73db83a2","Type":"ContainerDied","Data":"bf811d13d002ced308326ee58960f034e249e82b4acebe5e02eec0b05d312fef"}
Feb 27 06:33:54 crc kubenswrapper[4725]: I0227 06:33:54.160690 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" event={"ID":"fd6a75bb-19e2-4fa9-894c-a89271aa9c50","Type":"ContainerStarted","Data":"334dee8c7e9b066f350f06f9e395615502274e2babcb569fd49c027877e81378"}
Feb 27 06:33:54 crc kubenswrapper[4725]: I0227 06:33:54.160740 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" event={"ID":"fd6a75bb-19e2-4fa9-894c-a89271aa9c50","Type":"ContainerStarted","Data":"d907907b1982923cf68eab30410bdfb65d53d59eb0a07e6357f10e769c8098c1"}
Feb 27 06:33:54 crc kubenswrapper[4725]: I0227 06:33:54.209597 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" podStartSLOduration=2.209579922 podStartE2EDuration="2.209579922s" podCreationTimestamp="2026-02-27 06:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:54.203206592 +0000 UTC m=+1412.665827181" watchObservedRunningTime="2026-02-27 06:33:54.209579922 +0000 UTC m=+1412.672200491"
Feb 27 06:33:55 crc kubenswrapper[4725]: I0227 06:33:55.066770 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 06:33:55 crc kubenswrapper[4725]: I0227 06:33:55.077798 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.201146 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b20acebe-3d2a-4342-9f32-2b12d70bfd1b","Type":"ContainerStarted","Data":"829e3fd17516b62cd0b9a2a3bda351e02af1a836238e88f3a7d78e0bc339eb56"}
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.201602 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b20acebe-3d2a-4342-9f32-2b12d70bfd1b","Type":"ContainerStarted","Data":"d2f10201c241ff6af707f25696d6b624017f53cc5590dac5e80941042f4faad5"}
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.201708 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-log" containerID="cri-o://d2f10201c241ff6af707f25696d6b624017f53cc5590dac5e80941042f4faad5" gracePeriod=30
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.202177 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-metadata" containerID="cri-o://829e3fd17516b62cd0b9a2a3bda351e02af1a836238e88f3a7d78e0bc339eb56" gracePeriod=30
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.208265 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80307e35-9ed6-44cb-81e6-157d907abb4d","Type":"ContainerStarted","Data":"9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b"}
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.208346 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80307e35-9ed6-44cb-81e6-157d907abb4d","Type":"ContainerStarted","Data":"482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6"}
Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.212006 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9d2a9b00-096a-40dc-8477-1fb98996f32a","Type":"ContainerStarted","Data":"9d98baf31598be01014934cf4857b43b2a77e64c204543b5d7a508b969930958"} Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.212149 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9d2a9b00-096a-40dc-8477-1fb98996f32a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9d98baf31598be01014934cf4857b43b2a77e64c204543b5d7a508b969930958" gracePeriod=30 Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.218020 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" event={"ID":"837a7d0c-6989-4582-9d05-8b5c73db83a2","Type":"ContainerStarted","Data":"fd1998b0e1b622db24ec2e1af09893d4ac39704d70bc87e391e687da67841d45"} Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.218603 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.221020 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b360a03-f3d8-4111-a292-7c6d843c5897","Type":"ContainerStarted","Data":"09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4"} Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.224006 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.556966788 podStartE2EDuration="6.223993789s" podCreationTimestamp="2026-02-27 06:33:51 +0000 UTC" firstStartedPulling="2026-02-27 06:33:52.753847934 +0000 UTC m=+1411.216468503" lastFinishedPulling="2026-02-27 06:33:56.420874935 +0000 UTC m=+1414.883495504" observedRunningTime="2026-02-27 06:33:57.223121554 +0000 UTC m=+1415.685742163" watchObservedRunningTime="2026-02-27 06:33:57.223993789 +0000 UTC m=+1415.686614348" Feb 27 06:33:57 crc 
kubenswrapper[4725]: I0227 06:33:57.260915 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.366483483 podStartE2EDuration="6.260894959s" podCreationTimestamp="2026-02-27 06:33:51 +0000 UTC" firstStartedPulling="2026-02-27 06:33:52.525133811 +0000 UTC m=+1410.987754380" lastFinishedPulling="2026-02-27 06:33:56.419545247 +0000 UTC m=+1414.882165856" observedRunningTime="2026-02-27 06:33:57.253810849 +0000 UTC m=+1415.716431428" watchObservedRunningTime="2026-02-27 06:33:57.260894959 +0000 UTC m=+1415.723515528" Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.271953 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.624986725 podStartE2EDuration="6.27193555s" podCreationTimestamp="2026-02-27 06:33:51 +0000 UTC" firstStartedPulling="2026-02-27 06:33:52.754241665 +0000 UTC m=+1411.216862234" lastFinishedPulling="2026-02-27 06:33:56.40119048 +0000 UTC m=+1414.863811059" observedRunningTime="2026-02-27 06:33:57.269615194 +0000 UTC m=+1415.732235773" watchObservedRunningTime="2026-02-27 06:33:57.27193555 +0000 UTC m=+1415.734556119" Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.304262 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" podStartSLOduration=6.30423829 podStartE2EDuration="6.30423829s" podCreationTimestamp="2026-02-27 06:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:33:57.295493343 +0000 UTC m=+1415.758113912" watchObservedRunningTime="2026-02-27 06:33:57.30423829 +0000 UTC m=+1415.766858859" Feb 27 06:33:57 crc kubenswrapper[4725]: I0227 06:33:57.330574 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.872475387 
podStartE2EDuration="6.330558351s" podCreationTimestamp="2026-02-27 06:33:51 +0000 UTC" firstStartedPulling="2026-02-27 06:33:52.956882064 +0000 UTC m=+1411.419502633" lastFinishedPulling="2026-02-27 06:33:56.414965028 +0000 UTC m=+1414.877585597" observedRunningTime="2026-02-27 06:33:57.317794421 +0000 UTC m=+1415.780415000" watchObservedRunningTime="2026-02-27 06:33:57.330558351 +0000 UTC m=+1415.793178920" Feb 27 06:33:58 crc kubenswrapper[4725]: I0227 06:33:58.236510 4725 generic.go:334] "Generic (PLEG): container finished" podID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerID="d2f10201c241ff6af707f25696d6b624017f53cc5590dac5e80941042f4faad5" exitCode=143 Feb 27 06:33:58 crc kubenswrapper[4725]: I0227 06:33:58.236562 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b20acebe-3d2a-4342-9f32-2b12d70bfd1b","Type":"ContainerDied","Data":"d2f10201c241ff6af707f25696d6b624017f53cc5590dac5e80941042f4faad5"} Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.194259 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536234-868m6"] Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.195986 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.199928 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.200627 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.201017 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.201565 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536234-868m6"] Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.285725 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrlq2\" (UniqueName: \"kubernetes.io/projected/4b093247-2faf-4e4f-9164-24ac3654ca46-kube-api-access-hrlq2\") pod \"auto-csr-approver-29536234-868m6\" (UID: \"4b093247-2faf-4e4f-9164-24ac3654ca46\") " pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.392052 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrlq2\" (UniqueName: \"kubernetes.io/projected/4b093247-2faf-4e4f-9164-24ac3654ca46-kube-api-access-hrlq2\") pod \"auto-csr-approver-29536234-868m6\" (UID: \"4b093247-2faf-4e4f-9164-24ac3654ca46\") " pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.430122 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrlq2\" (UniqueName: \"kubernetes.io/projected/4b093247-2faf-4e4f-9164-24ac3654ca46-kube-api-access-hrlq2\") pod \"auto-csr-approver-29536234-868m6\" (UID: \"4b093247-2faf-4e4f-9164-24ac3654ca46\") " 
pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:00 crc kubenswrapper[4725]: I0227 06:34:00.516368 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:01 crc kubenswrapper[4725]: I0227 06:34:01.057576 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536234-868m6"] Feb 27 06:34:01 crc kubenswrapper[4725]: I0227 06:34:01.269420 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536234-868m6" event={"ID":"4b093247-2faf-4e4f-9164-24ac3654ca46","Type":"ContainerStarted","Data":"b715d7419b958883d2781ada968719a048fd02220091ea56c178d9711c49a24a"} Feb 27 06:34:01 crc kubenswrapper[4725]: I0227 06:34:01.272945 4725 generic.go:334] "Generic (PLEG): container finished" podID="29c41298-c7b2-4ea5-b628-ac1149cfa7da" containerID="07f6abf4996093c8ae50443c097e75baa035616a4ebcfb705657f9176d11d77b" exitCode=0 Feb 27 06:34:01 crc kubenswrapper[4725]: I0227 06:34:01.272999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g7bnn" event={"ID":"29c41298-c7b2-4ea5-b628-ac1149cfa7da","Type":"ContainerDied","Data":"07f6abf4996093c8ae50443c097e75baa035616a4ebcfb705657f9176d11d77b"} Feb 27 06:34:01 crc kubenswrapper[4725]: I0227 06:34:01.972579 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:34:01 crc kubenswrapper[4725]: I0227 06:34:01.972819 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.006027 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.006081 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 06:34:02 crc 
kubenswrapper[4725]: I0227 06:34:02.012272 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.057964 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.058057 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.101328 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.105494 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.173819 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-674d64fdcf-wfmcl"] Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.174648 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" podUID="29494296-f5e8-4f29-8123-83f487cace05" containerName="dnsmasq-dns" containerID="cri-o://378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e" gracePeriod=10 Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.359646 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.799630 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.866215 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.951118 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc9qs\" (UniqueName: \"kubernetes.io/projected/29494296-f5e8-4f29-8123-83f487cace05-kube-api-access-dc9qs\") pod \"29494296-f5e8-4f29-8123-83f487cace05\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.951556 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-swift-storage-0\") pod \"29494296-f5e8-4f29-8123-83f487cace05\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.951747 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-config-data\") pod \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.951962 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-config\") pod \"29494296-f5e8-4f29-8123-83f487cace05\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.952088 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-svc\") pod \"29494296-f5e8-4f29-8123-83f487cace05\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.952282 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-scripts\") pod \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.952501 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsq7d\" (UniqueName: \"kubernetes.io/projected/29c41298-c7b2-4ea5-b628-ac1149cfa7da-kube-api-access-lsq7d\") pod \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.952601 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-combined-ca-bundle\") pod \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\" (UID: \"29c41298-c7b2-4ea5-b628-ac1149cfa7da\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.952681 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-nb\") pod \"29494296-f5e8-4f29-8123-83f487cace05\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.952824 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-sb\") pod \"29494296-f5e8-4f29-8123-83f487cace05\" (UID: \"29494296-f5e8-4f29-8123-83f487cace05\") " Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.959163 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-scripts" (OuterVolumeSpecName: "scripts") pod "29c41298-c7b2-4ea5-b628-ac1149cfa7da" (UID: "29c41298-c7b2-4ea5-b628-ac1149cfa7da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.959210 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29494296-f5e8-4f29-8123-83f487cace05-kube-api-access-dc9qs" (OuterVolumeSpecName: "kube-api-access-dc9qs") pod "29494296-f5e8-4f29-8123-83f487cace05" (UID: "29494296-f5e8-4f29-8123-83f487cace05"). InnerVolumeSpecName "kube-api-access-dc9qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.959393 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c41298-c7b2-4ea5-b628-ac1149cfa7da-kube-api-access-lsq7d" (OuterVolumeSpecName: "kube-api-access-lsq7d") pod "29c41298-c7b2-4ea5-b628-ac1149cfa7da" (UID: "29c41298-c7b2-4ea5-b628-ac1149cfa7da"). InnerVolumeSpecName "kube-api-access-lsq7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:02 crc kubenswrapper[4725]: I0227 06:34:02.993774 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-config-data" (OuterVolumeSpecName: "config-data") pod "29c41298-c7b2-4ea5-b628-ac1149cfa7da" (UID: "29c41298-c7b2-4ea5-b628-ac1149cfa7da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.016450 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.016552 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.024326 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29494296-f5e8-4f29-8123-83f487cace05" (UID: "29494296-f5e8-4f29-8123-83f487cace05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.026363 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-config" (OuterVolumeSpecName: "config") pod "29494296-f5e8-4f29-8123-83f487cace05" (UID: "29494296-f5e8-4f29-8123-83f487cace05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.034497 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29494296-f5e8-4f29-8123-83f487cace05" (UID: "29494296-f5e8-4f29-8123-83f487cace05"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.042493 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29494296-f5e8-4f29-8123-83f487cace05" (UID: "29494296-f5e8-4f29-8123-83f487cace05"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.048160 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29494296-f5e8-4f29-8123-83f487cace05" (UID: "29494296-f5e8-4f29-8123-83f487cace05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055017 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055046 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc9qs\" (UniqueName: \"kubernetes.io/projected/29494296-f5e8-4f29-8123-83f487cace05-kube-api-access-dc9qs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055060 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055072 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055107 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055117 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055125 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055134 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsq7d\" (UniqueName: \"kubernetes.io/projected/29c41298-c7b2-4ea5-b628-ac1149cfa7da-kube-api-access-lsq7d\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.055144 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29494296-f5e8-4f29-8123-83f487cace05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.064685 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29c41298-c7b2-4ea5-b628-ac1149cfa7da" (UID: "29c41298-c7b2-4ea5-b628-ac1149cfa7da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.156374 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c41298-c7b2-4ea5-b628-ac1149cfa7da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.305627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g7bnn" event={"ID":"29c41298-c7b2-4ea5-b628-ac1149cfa7da","Type":"ContainerDied","Data":"9865cadecf32df7a844a0cb3ce9c51e48a11ca6031ffcc370df1f331a1fcc34f"} Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.305648 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g7bnn" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.305676 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9865cadecf32df7a844a0cb3ce9c51e48a11ca6031ffcc370df1f331a1fcc34f" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.307464 4725 generic.go:334] "Generic (PLEG): container finished" podID="4b093247-2faf-4e4f-9164-24ac3654ca46" containerID="555a6cc9bddc3d9b22d43b96773f9ca7dea7bf645367ed4567bfe4774e49e6ff" exitCode=0 Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.307536 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536234-868m6" event={"ID":"4b093247-2faf-4e4f-9164-24ac3654ca46","Type":"ContainerDied","Data":"555a6cc9bddc3d9b22d43b96773f9ca7dea7bf645367ed4567bfe4774e49e6ff"} Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.310728 4725 generic.go:334] "Generic (PLEG): container finished" podID="29494296-f5e8-4f29-8123-83f487cace05" containerID="378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e" exitCode=0 Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.310777 4725 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.310801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" event={"ID":"29494296-f5e8-4f29-8123-83f487cace05","Type":"ContainerDied","Data":"378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e"} Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.310884 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-674d64fdcf-wfmcl" event={"ID":"29494296-f5e8-4f29-8123-83f487cace05","Type":"ContainerDied","Data":"585d05a9a25e5d36fc17351a0803ed32fa3d5944f388723cb27e87560c3a03d7"} Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.310923 4725 scope.go:117] "RemoveContainer" containerID="378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.366642 4725 scope.go:117] "RemoveContainer" containerID="9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.448808 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-674d64fdcf-wfmcl"] Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.490796 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-674d64fdcf-wfmcl"] Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.497491 4725 scope.go:117] "RemoveContainer" containerID="378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e" Feb 27 06:34:03 crc kubenswrapper[4725]: E0227 06:34:03.501430 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e\": container with ID starting with 378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e not found: ID does not exist" 
containerID="378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.501475 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e"} err="failed to get container status \"378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e\": rpc error: code = NotFound desc = could not find container \"378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e\": container with ID starting with 378758b022063e3cc92d0cc910ae2df2f494f0eadbb7e27ac6603fb8e2a5ed3e not found: ID does not exist" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.501501 4725 scope.go:117] "RemoveContainer" containerID="9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a" Feb 27 06:34:03 crc kubenswrapper[4725]: E0227 06:34:03.505396 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a\": container with ID starting with 9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a not found: ID does not exist" containerID="9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.505433 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a"} err="failed to get container status \"9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a\": rpc error: code = NotFound desc = could not find container \"9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a\": container with ID starting with 9ec50fc9f3b87ec1f3a499240d46845c8ee92db97b6fbaf34dc56298bca98d5a not found: ID does not exist" Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.575024 4725 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.575244 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-log" containerID="cri-o://482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6" gracePeriod=30 Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.575351 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-api" containerID="cri-o://9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b" gracePeriod=30 Feb 27 06:34:03 crc kubenswrapper[4725]: I0227 06:34:03.612688 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.263153 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29494296-f5e8-4f29-8123-83f487cace05" path="/var/lib/kubelet/pods/29494296-f5e8-4f29-8123-83f487cace05/volumes" Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.327133 4725 generic.go:334] "Generic (PLEG): container finished" podID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerID="482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6" exitCode=143 Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.327224 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80307e35-9ed6-44cb-81e6-157d907abb4d","Type":"ContainerDied","Data":"482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6"} Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.331390 4725 generic.go:334] "Generic (PLEG): container finished" podID="fd6a75bb-19e2-4fa9-894c-a89271aa9c50" containerID="334dee8c7e9b066f350f06f9e395615502274e2babcb569fd49c027877e81378" exitCode=0 Feb 27 06:34:04 crc kubenswrapper[4725]: 
I0227 06:34:04.331546 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" event={"ID":"fd6a75bb-19e2-4fa9-894c-a89271aa9c50","Type":"ContainerDied","Data":"334dee8c7e9b066f350f06f9e395615502274e2babcb569fd49c027877e81378"} Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.331774 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0b360a03-f3d8-4111-a292-7c6d843c5897" containerName="nova-scheduler-scheduler" containerID="cri-o://09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" gracePeriod=30 Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.801673 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.890524 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrlq2\" (UniqueName: \"kubernetes.io/projected/4b093247-2faf-4e4f-9164-24ac3654ca46-kube-api-access-hrlq2\") pod \"4b093247-2faf-4e4f-9164-24ac3654ca46\" (UID: \"4b093247-2faf-4e4f-9164-24ac3654ca46\") " Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.905624 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b093247-2faf-4e4f-9164-24ac3654ca46-kube-api-access-hrlq2" (OuterVolumeSpecName: "kube-api-access-hrlq2") pod "4b093247-2faf-4e4f-9164-24ac3654ca46" (UID: "4b093247-2faf-4e4f-9164-24ac3654ca46"). InnerVolumeSpecName "kube-api-access-hrlq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:04 crc kubenswrapper[4725]: I0227 06:34:04.992503 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrlq2\" (UniqueName: \"kubernetes.io/projected/4b093247-2faf-4e4f-9164-24ac3654ca46-kube-api-access-hrlq2\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.354744 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536234-868m6" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.354832 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536234-868m6" event={"ID":"4b093247-2faf-4e4f-9164-24ac3654ca46","Type":"ContainerDied","Data":"b715d7419b958883d2781ada968719a048fd02220091ea56c178d9711c49a24a"} Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.354898 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b715d7419b958883d2781ada968719a048fd02220091ea56c178d9711c49a24a" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.747587 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.807383 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-config-data\") pod \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.807574 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-scripts\") pod \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.807616 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-combined-ca-bundle\") pod \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.807640 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trh4x\" (UniqueName: \"kubernetes.io/projected/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-kube-api-access-trh4x\") pod \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\" (UID: \"fd6a75bb-19e2-4fa9-894c-a89271aa9c50\") " Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.812230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-kube-api-access-trh4x" (OuterVolumeSpecName: "kube-api-access-trh4x") pod "fd6a75bb-19e2-4fa9-894c-a89271aa9c50" (UID: "fd6a75bb-19e2-4fa9-894c-a89271aa9c50"). InnerVolumeSpecName "kube-api-access-trh4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.814336 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-scripts" (OuterVolumeSpecName: "scripts") pod "fd6a75bb-19e2-4fa9-894c-a89271aa9c50" (UID: "fd6a75bb-19e2-4fa9-894c-a89271aa9c50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.853205 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd6a75bb-19e2-4fa9-894c-a89271aa9c50" (UID: "fd6a75bb-19e2-4fa9-894c-a89271aa9c50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.865716 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-config-data" (OuterVolumeSpecName: "config-data") pod "fd6a75bb-19e2-4fa9-894c-a89271aa9c50" (UID: "fd6a75bb-19e2-4fa9-894c-a89271aa9c50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.888343 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536228-jrqtv"] Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.894470 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536228-jrqtv"] Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.910161 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.910302 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.910412 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trh4x\" (UniqueName: \"kubernetes.io/projected/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-kube-api-access-trh4x\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:05 crc kubenswrapper[4725]: I0227 06:34:05.910484 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6a75bb-19e2-4fa9-894c-a89271aa9c50-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.264037 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0efcde2-60c3-4d14-bec9-056e06640cc6" path="/var/lib/kubelet/pods/c0efcde2-60c3-4d14-bec9-056e06640cc6/volumes" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.373276 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" 
event={"ID":"fd6a75bb-19e2-4fa9-894c-a89271aa9c50","Type":"ContainerDied","Data":"d907907b1982923cf68eab30410bdfb65d53d59eb0a07e6357f10e769c8098c1"} Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.373348 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d907907b1982923cf68eab30410bdfb65d53d59eb0a07e6357f10e769c8098c1" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.373406 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dhc4" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.487312 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 27 06:34:06 crc kubenswrapper[4725]: E0227 06:34:06.487733 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29494296-f5e8-4f29-8123-83f487cace05" containerName="dnsmasq-dns" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.487750 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="29494296-f5e8-4f29-8123-83f487cace05" containerName="dnsmasq-dns" Feb 27 06:34:06 crc kubenswrapper[4725]: E0227 06:34:06.487759 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c41298-c7b2-4ea5-b628-ac1149cfa7da" containerName="nova-manage" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.487765 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c41298-c7b2-4ea5-b628-ac1149cfa7da" containerName="nova-manage" Feb 27 06:34:06 crc kubenswrapper[4725]: E0227 06:34:06.487777 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29494296-f5e8-4f29-8123-83f487cace05" containerName="init" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.487784 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="29494296-f5e8-4f29-8123-83f487cace05" containerName="init" Feb 27 06:34:06 crc kubenswrapper[4725]: E0227 06:34:06.487797 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b093247-2faf-4e4f-9164-24ac3654ca46" containerName="oc" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.487803 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b093247-2faf-4e4f-9164-24ac3654ca46" containerName="oc" Feb 27 06:34:06 crc kubenswrapper[4725]: E0227 06:34:06.487814 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6a75bb-19e2-4fa9-894c-a89271aa9c50" containerName="nova-cell1-conductor-db-sync" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.487820 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6a75bb-19e2-4fa9-894c-a89271aa9c50" containerName="nova-cell1-conductor-db-sync" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.488004 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="29494296-f5e8-4f29-8123-83f487cace05" containerName="dnsmasq-dns" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.488020 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6a75bb-19e2-4fa9-894c-a89271aa9c50" containerName="nova-cell1-conductor-db-sync" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.488037 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c41298-c7b2-4ea5-b628-ac1149cfa7da" containerName="nova-manage" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.488044 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b093247-2faf-4e4f-9164-24ac3654ca46" containerName="oc" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.488686 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.490916 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.534220 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.625907 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ef4822-4a88-4b23-8c61-03d89105d848-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.625972 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ef4822-4a88-4b23-8c61-03d89105d848-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.626006 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t8x6\" (UniqueName: \"kubernetes.io/projected/25ef4822-4a88-4b23-8c61-03d89105d848-kube-api-access-6t8x6\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.727426 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ef4822-4a88-4b23-8c61-03d89105d848-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc 
kubenswrapper[4725]: I0227 06:34:06.727477 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ef4822-4a88-4b23-8c61-03d89105d848-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.727516 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t8x6\" (UniqueName: \"kubernetes.io/projected/25ef4822-4a88-4b23-8c61-03d89105d848-kube-api-access-6t8x6\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.735810 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ef4822-4a88-4b23-8c61-03d89105d848-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.736022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ef4822-4a88-4b23-8c61-03d89105d848-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.749702 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t8x6\" (UniqueName: \"kubernetes.io/projected/25ef4822-4a88-4b23-8c61-03d89105d848-kube-api-access-6t8x6\") pod \"nova-cell1-conductor-0\" (UID: \"25ef4822-4a88-4b23-8c61-03d89105d848\") " pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:06 crc kubenswrapper[4725]: I0227 06:34:06.815985 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:07 crc kubenswrapper[4725]: E0227 06:34:07.059560 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 06:34:07 crc kubenswrapper[4725]: E0227 06:34:07.064879 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 06:34:07 crc kubenswrapper[4725]: E0227 06:34:07.070137 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 06:34:07 crc kubenswrapper[4725]: E0227 06:34:07.070199 4725 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0b360a03-f3d8-4111-a292-7c6d843c5897" containerName="nova-scheduler-scheduler" Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.302183 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 27 06:34:07 crc kubenswrapper[4725]: W0227 06:34:07.311854 4725 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ef4822_4a88_4b23_8c61_03d89105d848.slice/crio-fd4ab42ddddea9d6bde4c841378956bab09a18f264ce4ff2ef95d19ba05841fc WatchSource:0}: Error finding container fd4ab42ddddea9d6bde4c841378956bab09a18f264ce4ff2ef95d19ba05841fc: Status 404 returned error can't find the container with id fd4ab42ddddea9d6bde4c841378956bab09a18f264ce4ff2ef95d19ba05841fc Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.386816 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"25ef4822-4a88-4b23-8c61-03d89105d848","Type":"ContainerStarted","Data":"fd4ab42ddddea9d6bde4c841378956bab09a18f264ce4ff2ef95d19ba05841fc"} Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.857662 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.951481 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-combined-ca-bundle\") pod \"80307e35-9ed6-44cb-81e6-157d907abb4d\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.951665 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-config-data\") pod \"80307e35-9ed6-44cb-81e6-157d907abb4d\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.952618 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76zmh\" (UniqueName: \"kubernetes.io/projected/80307e35-9ed6-44cb-81e6-157d907abb4d-kube-api-access-76zmh\") pod \"80307e35-9ed6-44cb-81e6-157d907abb4d\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " Feb 27 06:34:07 crc 
kubenswrapper[4725]: I0227 06:34:07.952928 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80307e35-9ed6-44cb-81e6-157d907abb4d-logs\") pod \"80307e35-9ed6-44cb-81e6-157d907abb4d\" (UID: \"80307e35-9ed6-44cb-81e6-157d907abb4d\") " Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.953538 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80307e35-9ed6-44cb-81e6-157d907abb4d-logs" (OuterVolumeSpecName: "logs") pod "80307e35-9ed6-44cb-81e6-157d907abb4d" (UID: "80307e35-9ed6-44cb-81e6-157d907abb4d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.953762 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80307e35-9ed6-44cb-81e6-157d907abb4d-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.958908 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80307e35-9ed6-44cb-81e6-157d907abb4d-kube-api-access-76zmh" (OuterVolumeSpecName: "kube-api-access-76zmh") pod "80307e35-9ed6-44cb-81e6-157d907abb4d" (UID: "80307e35-9ed6-44cb-81e6-157d907abb4d"). InnerVolumeSpecName "kube-api-access-76zmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:07 crc kubenswrapper[4725]: I0227 06:34:07.990857 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-config-data" (OuterVolumeSpecName: "config-data") pod "80307e35-9ed6-44cb-81e6-157d907abb4d" (UID: "80307e35-9ed6-44cb-81e6-157d907abb4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.010811 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80307e35-9ed6-44cb-81e6-157d907abb4d" (UID: "80307e35-9ed6-44cb-81e6-157d907abb4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.056266 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.056338 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80307e35-9ed6-44cb-81e6-157d907abb4d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.056359 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76zmh\" (UniqueName: \"kubernetes.io/projected/80307e35-9ed6-44cb-81e6-157d907abb4d-kube-api-access-76zmh\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.378607 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.398345 4725 generic.go:334] "Generic (PLEG): container finished" podID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerID="9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b" exitCode=0 Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.398419 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"80307e35-9ed6-44cb-81e6-157d907abb4d","Type":"ContainerDied","Data":"9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b"} Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.398440 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.398502 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80307e35-9ed6-44cb-81e6-157d907abb4d","Type":"ContainerDied","Data":"ba580f9be45255970955eb03b05df609f433ea0cbfb973d5e374d1cbe932fc11"} Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.398584 4725 scope.go:117] "RemoveContainer" containerID="9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.402702 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"25ef4822-4a88-4b23-8c61-03d89105d848","Type":"ContainerStarted","Data":"9eeede93f510678e12e6449b5dfb7410f79e4641ecf805d431bf06e09da68e69"} Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.403163 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.456439 4725 scope.go:117] "RemoveContainer" containerID="482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.477852 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.477826952 podStartE2EDuration="2.477826952s" podCreationTimestamp="2026-02-27 06:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:08.440000666 +0000 UTC m=+1426.902621265" watchObservedRunningTime="2026-02-27 06:34:08.477826952 +0000 UTC 
m=+1426.940447531" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.487618 4725 scope.go:117] "RemoveContainer" containerID="9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b" Feb 27 06:34:08 crc kubenswrapper[4725]: E0227 06:34:08.491186 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b\": container with ID starting with 9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b not found: ID does not exist" containerID="9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.491247 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b"} err="failed to get container status \"9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b\": rpc error: code = NotFound desc = could not find container \"9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b\": container with ID starting with 9c19d69318c6effc94233b8cfad62a1c733a06d3af92d98c390337450a34fb5b not found: ID does not exist" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.491302 4725 scope.go:117] "RemoveContainer" containerID="482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6" Feb 27 06:34:08 crc kubenswrapper[4725]: E0227 06:34:08.493824 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6\": container with ID starting with 482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6 not found: ID does not exist" containerID="482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.493889 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6"} err="failed to get container status \"482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6\": rpc error: code = NotFound desc = could not find container \"482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6\": container with ID starting with 482f04ed641a7b226176e18bba6b1ec522dc44d0043d163410656b80c66f4ee6 not found: ID does not exist" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.544848 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.558150 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.567592 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:08 crc kubenswrapper[4725]: E0227 06:34:08.568022 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-api" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.568039 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-api" Feb 27 06:34:08 crc kubenswrapper[4725]: E0227 06:34:08.568070 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-log" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.568077 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-log" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.568245 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-log" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.568262 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" containerName="nova-api-api" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.569308 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.572994 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.578259 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.676353 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvdc4\" (UniqueName: \"kubernetes.io/projected/cf5c1d61-206a-4731-90d7-19755263893a-kube-api-access-nvdc4\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.676467 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.676524 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-config-data\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.676572 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5c1d61-206a-4731-90d7-19755263893a-logs\") pod 
\"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.779148 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvdc4\" (UniqueName: \"kubernetes.io/projected/cf5c1d61-206a-4731-90d7-19755263893a-kube-api-access-nvdc4\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.779344 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.779410 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-config-data\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.779459 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5c1d61-206a-4731-90d7-19755263893a-logs\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.780022 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5c1d61-206a-4731-90d7-19755263893a-logs\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.789521 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-config-data\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.795349 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.812686 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvdc4\" (UniqueName: \"kubernetes.io/projected/cf5c1d61-206a-4731-90d7-19755263893a-kube-api-access-nvdc4\") pod \"nova-api-0\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " pod="openstack/nova-api-0" Feb 27 06:34:08 crc kubenswrapper[4725]: I0227 06:34:08.912988 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.179862 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.289381 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-combined-ca-bundle\") pod \"0b360a03-f3d8-4111-a292-7c6d843c5897\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.289493 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnq5b\" (UniqueName: \"kubernetes.io/projected/0b360a03-f3d8-4111-a292-7c6d843c5897-kube-api-access-wnq5b\") pod \"0b360a03-f3d8-4111-a292-7c6d843c5897\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.290095 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-config-data\") pod \"0b360a03-f3d8-4111-a292-7c6d843c5897\" (UID: \"0b360a03-f3d8-4111-a292-7c6d843c5897\") " Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.301182 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b360a03-f3d8-4111-a292-7c6d843c5897-kube-api-access-wnq5b" (OuterVolumeSpecName: "kube-api-access-wnq5b") pod "0b360a03-f3d8-4111-a292-7c6d843c5897" (UID: "0b360a03-f3d8-4111-a292-7c6d843c5897"). InnerVolumeSpecName "kube-api-access-wnq5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.322805 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b360a03-f3d8-4111-a292-7c6d843c5897" (UID: "0b360a03-f3d8-4111-a292-7c6d843c5897"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.323656 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-config-data" (OuterVolumeSpecName: "config-data") pod "0b360a03-f3d8-4111-a292-7c6d843c5897" (UID: "0b360a03-f3d8-4111-a292-7c6d843c5897"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.392548 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnq5b\" (UniqueName: \"kubernetes.io/projected/0b360a03-f3d8-4111-a292-7c6d843c5897-kube-api-access-wnq5b\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.392840 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.392850 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b360a03-f3d8-4111-a292-7c6d843c5897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.415986 4725 generic.go:334] "Generic (PLEG): container finished" podID="0b360a03-f3d8-4111-a292-7c6d843c5897" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" exitCode=0 Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.416048 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b360a03-f3d8-4111-a292-7c6d843c5897","Type":"ContainerDied","Data":"09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4"} Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.416073 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"0b360a03-f3d8-4111-a292-7c6d843c5897","Type":"ContainerDied","Data":"6d166bb84c9b855e0b80e4cfd7a722ac72f412f0c9eedfad6283192cf1cf956f"} Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.416088 4725 scope.go:117] "RemoveContainer" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.416196 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.453313 4725 scope.go:117] "RemoveContainer" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" Feb 27 06:34:09 crc kubenswrapper[4725]: E0227 06:34:09.454620 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4\": container with ID starting with 09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4 not found: ID does not exist" containerID="09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.454670 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4"} err="failed to get container status \"09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4\": rpc error: code = NotFound desc = could not find container \"09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4\": container with ID starting with 09dd3be29ddb9a9d9ba29c98dc2bdc2702149610f243ee56fbd371af2de085a4 not found: ID does not exist" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.473651 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.483423 4725 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.495338 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:09 crc kubenswrapper[4725]: E0227 06:34:09.495756 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b360a03-f3d8-4111-a292-7c6d843c5897" containerName="nova-scheduler-scheduler" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.495770 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b360a03-f3d8-4111-a292-7c6d843c5897" containerName="nova-scheduler-scheduler" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.495974 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b360a03-f3d8-4111-a292-7c6d843c5897" containerName="nova-scheduler-scheduler" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.496609 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.499837 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.509603 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:09 crc kubenswrapper[4725]: W0227 06:34:09.560018 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5c1d61_206a_4731_90d7_19755263893a.slice/crio-356cad0bd62035b13a23a0fe5d7494fbe108efb1a20d8ef87518dc23735927fa WatchSource:0}: Error finding container 356cad0bd62035b13a23a0fe5d7494fbe108efb1a20d8ef87518dc23735927fa: Status 404 returned error can't find the container with id 356cad0bd62035b13a23a0fe5d7494fbe108efb1a20d8ef87518dc23735927fa Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.566526 4725 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.596541 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-config-data\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.596637 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktlr5\" (UniqueName: \"kubernetes.io/projected/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-kube-api-access-ktlr5\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.596782 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.698538 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.698723 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-config-data\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.698810 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlr5\" (UniqueName: \"kubernetes.io/projected/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-kube-api-access-ktlr5\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.703540 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.703594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-config-data\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.722731 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlr5\" (UniqueName: \"kubernetes.io/projected/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-kube-api-access-ktlr5\") pod \"nova-scheduler-0\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:09 crc kubenswrapper[4725]: I0227 06:34:09.831754 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.264143 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b360a03-f3d8-4111-a292-7c6d843c5897" path="/var/lib/kubelet/pods/0b360a03-f3d8-4111-a292-7c6d843c5897/volumes" Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.265077 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80307e35-9ed6-44cb-81e6-157d907abb4d" path="/var/lib/kubelet/pods/80307e35-9ed6-44cb-81e6-157d907abb4d/volumes" Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.438360 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf5c1d61-206a-4731-90d7-19755263893a","Type":"ContainerStarted","Data":"7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f"} Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.438400 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf5c1d61-206a-4731-90d7-19755263893a","Type":"ContainerStarted","Data":"5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253"} Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.438410 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf5c1d61-206a-4731-90d7-19755263893a","Type":"ContainerStarted","Data":"356cad0bd62035b13a23a0fe5d7494fbe108efb1a20d8ef87518dc23735927fa"} Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.459436 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.459419494 podStartE2EDuration="2.459419494s" podCreationTimestamp="2026-02-27 06:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:10.454130335 +0000 UTC m=+1428.916750904" watchObservedRunningTime="2026-02-27 06:34:10.459419494 +0000 UTC 
m=+1428.922040063" Feb 27 06:34:10 crc kubenswrapper[4725]: I0227 06:34:10.488296 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:10 crc kubenswrapper[4725]: W0227 06:34:10.489463 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0a6e56c_1da8_4476_b8c9_f727c831c6c6.slice/crio-f6494d3e38c77eda85279e0e38e012be5b0a33617671004af32d2a0004c1e245 WatchSource:0}: Error finding container f6494d3e38c77eda85279e0e38e012be5b0a33617671004af32d2a0004c1e245: Status 404 returned error can't find the container with id f6494d3e38c77eda85279e0e38e012be5b0a33617671004af32d2a0004c1e245 Feb 27 06:34:11 crc kubenswrapper[4725]: I0227 06:34:11.447598 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a6e56c-1da8-4476-b8c9-f727c831c6c6","Type":"ContainerStarted","Data":"42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce"} Feb 27 06:34:11 crc kubenswrapper[4725]: I0227 06:34:11.448173 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a6e56c-1da8-4476-b8c9-f727c831c6c6","Type":"ContainerStarted","Data":"f6494d3e38c77eda85279e0e38e012be5b0a33617671004af32d2a0004c1e245"} Feb 27 06:34:11 crc kubenswrapper[4725]: I0227 06:34:11.473266 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.473250724 podStartE2EDuration="2.473250724s" podCreationTimestamp="2026-02-27 06:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:11.468306195 +0000 UTC m=+1429.930926774" watchObservedRunningTime="2026-02-27 06:34:11.473250724 +0000 UTC m=+1429.935871293" Feb 27 06:34:12 crc kubenswrapper[4725]: I0227 06:34:12.928416 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Feb 27 06:34:12 crc kubenswrapper[4725]: I0227 06:34:12.928884 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" containerName="kube-state-metrics" containerID="cri-o://eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2" gracePeriod=30 Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.447032 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.478752 4725 generic.go:334] "Generic (PLEG): container finished" podID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" containerID="eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2" exitCode=2 Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.478802 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b57cf7f3-cfa9-403a-8c71-84b46d6dd189","Type":"ContainerDied","Data":"eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2"} Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.478830 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b57cf7f3-cfa9-403a-8c71-84b46d6dd189","Type":"ContainerDied","Data":"c6e15fc294d81245870a8b9bdbf6c29e0b1b46827ecf6db37201088db6f3363a"} Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.478869 4725 scope.go:117] "RemoveContainer" containerID="eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.479013 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.503744 4725 scope.go:117] "RemoveContainer" containerID="eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2" Feb 27 06:34:13 crc kubenswrapper[4725]: E0227 06:34:13.504280 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2\": container with ID starting with eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2 not found: ID does not exist" containerID="eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.504331 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2"} err="failed to get container status \"eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2\": rpc error: code = NotFound desc = could not find container \"eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2\": container with ID starting with eb3a66b25976cdffc5812e430d8eaf7e4e0246d7482e9aea4c769931d674b2f2 not found: ID does not exist" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.579741 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-474tq\" (UniqueName: \"kubernetes.io/projected/b57cf7f3-cfa9-403a-8c71-84b46d6dd189-kube-api-access-474tq\") pod \"b57cf7f3-cfa9-403a-8c71-84b46d6dd189\" (UID: \"b57cf7f3-cfa9-403a-8c71-84b46d6dd189\") " Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.585394 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57cf7f3-cfa9-403a-8c71-84b46d6dd189-kube-api-access-474tq" (OuterVolumeSpecName: "kube-api-access-474tq") pod "b57cf7f3-cfa9-403a-8c71-84b46d6dd189" (UID: 
"b57cf7f3-cfa9-403a-8c71-84b46d6dd189"). InnerVolumeSpecName "kube-api-access-474tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.681982 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-474tq\" (UniqueName: \"kubernetes.io/projected/b57cf7f3-cfa9-403a-8c71-84b46d6dd189-kube-api-access-474tq\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.818731 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.835946 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.851811 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:34:13 crc kubenswrapper[4725]: E0227 06:34:13.852429 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" containerName="kube-state-metrics" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.852492 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" containerName="kube-state-metrics" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.852717 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" containerName="kube-state-metrics" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.853443 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.856054 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.857479 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.871437 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.988821 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.988875 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgf6f\" (UniqueName: \"kubernetes.io/projected/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-api-access-pgf6f\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.988961 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:13 crc kubenswrapper[4725]: I0227 06:34:13.989062 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.091315 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.091501 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.091538 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgf6f\" (UniqueName: \"kubernetes.io/projected/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-api-access-pgf6f\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.091586 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.096068 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.096838 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.097360 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.118422 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgf6f\" (UniqueName: \"kubernetes.io/projected/9d1d1822-20db-4d79-9ba2-0746292596c6-kube-api-access-pgf6f\") pod \"kube-state-metrics-0\" (UID: \"9d1d1822-20db-4d79-9ba2-0746292596c6\") " pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.190107 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.280799 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57cf7f3-cfa9-403a-8c71-84b46d6dd189" path="/var/lib/kubelet/pods/b57cf7f3-cfa9-403a-8c71-84b46d6dd189/volumes" Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.681162 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.805689 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.806154 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-central-agent" containerID="cri-o://db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7" gracePeriod=30 Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.806263 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="sg-core" containerID="cri-o://9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b" gracePeriod=30 Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.806323 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-notification-agent" containerID="cri-o://3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356" gracePeriod=30 Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.806278 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="proxy-httpd" containerID="cri-o://3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc" 
gracePeriod=30 Feb 27 06:34:14 crc kubenswrapper[4725]: I0227 06:34:14.832826 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.500331 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d1d1822-20db-4d79-9ba2-0746292596c6","Type":"ContainerStarted","Data":"cba9cb3c77ad9215c1679da5b28ef5b84e9dbc66e0054b4082d68145935e74f7"} Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.503679 4725 generic.go:334] "Generic (PLEG): container finished" podID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerID="3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc" exitCode=0 Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.503705 4725 generic.go:334] "Generic (PLEG): container finished" podID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerID="9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b" exitCode=2 Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.503703 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerDied","Data":"3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc"} Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.503721 4725 generic.go:334] "Generic (PLEG): container finished" podID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerID="db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7" exitCode=0 Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.503729 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerDied","Data":"9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b"} Feb 27 06:34:15 crc kubenswrapper[4725]: I0227 06:34:15.503738 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerDied","Data":"db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7"} Feb 27 06:34:16 crc kubenswrapper[4725]: I0227 06:34:16.516627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d1d1822-20db-4d79-9ba2-0746292596c6","Type":"ContainerStarted","Data":"6a148beeea6d4046102b455286c58d6719247a99f8ace188bf42b238e96bd99a"} Feb 27 06:34:16 crc kubenswrapper[4725]: I0227 06:34:16.518160 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 27 06:34:16 crc kubenswrapper[4725]: I0227 06:34:16.537395 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.167656756 podStartE2EDuration="3.537369701s" podCreationTimestamp="2026-02-27 06:34:13 +0000 UTC" firstStartedPulling="2026-02-27 06:34:14.685795622 +0000 UTC m=+1433.148416181" lastFinishedPulling="2026-02-27 06:34:15.055508557 +0000 UTC m=+1433.518129126" observedRunningTime="2026-02-27 06:34:16.533840681 +0000 UTC m=+1434.996461270" watchObservedRunningTime="2026-02-27 06:34:16.537369701 +0000 UTC m=+1434.999990310" Feb 27 06:34:16 crc kubenswrapper[4725]: I0227 06:34:16.850417 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 27 06:34:18 crc kubenswrapper[4725]: I0227 06:34:18.913845 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:34:18 crc kubenswrapper[4725]: I0227 06:34:18.914119 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:34:19 crc kubenswrapper[4725]: I0227 06:34:19.832181 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 06:34:19 crc kubenswrapper[4725]: I0227 06:34:19.866968 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 06:34:19 crc kubenswrapper[4725]: I0227 06:34:19.997529 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:19 crc kubenswrapper[4725]: I0227 06:34:19.997558 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:20 crc kubenswrapper[4725]: I0227 06:34:20.618993 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.294977 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442133 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjc8\" (UniqueName: \"kubernetes.io/projected/970a7b0c-8b62-418e-8317-b408eb9e70a0-kube-api-access-wsjc8\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442179 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-run-httpd\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442215 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-sg-core-conf-yaml\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442247 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-scripts\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442299 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-combined-ca-bundle\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442362 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-log-httpd\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.442396 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-config-data\") pod \"970a7b0c-8b62-418e-8317-b408eb9e70a0\" (UID: \"970a7b0c-8b62-418e-8317-b408eb9e70a0\") " Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.443994 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.444054 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.453258 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-scripts" (OuterVolumeSpecName: "scripts") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.464497 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/970a7b0c-8b62-418e-8317-b408eb9e70a0-kube-api-access-wsjc8" (OuterVolumeSpecName: "kube-api-access-wsjc8") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). InnerVolumeSpecName "kube-api-access-wsjc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.529994 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.541280 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.545337 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjc8\" (UniqueName: \"kubernetes.io/projected/970a7b0c-8b62-418e-8317-b408eb9e70a0-kube-api-access-wsjc8\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.545366 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.545375 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.545384 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.545393 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.545401 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/970a7b0c-8b62-418e-8317-b408eb9e70a0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.566106 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-config-data" (OuterVolumeSpecName: "config-data") pod "970a7b0c-8b62-418e-8317-b408eb9e70a0" (UID: "970a7b0c-8b62-418e-8317-b408eb9e70a0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.567230 4725 generic.go:334] "Generic (PLEG): container finished" podID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerID="3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356" exitCode=0 Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.568058 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.568364 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerDied","Data":"3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356"} Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.568422 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"970a7b0c-8b62-418e-8317-b408eb9e70a0","Type":"ContainerDied","Data":"ac1a9d96f5d6900239ce4eda668a79e4abe90ebe43580b238862ec19e70a8a84"} Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.568442 4725 scope.go:117] "RemoveContainer" containerID="3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.586969 4725 scope.go:117] "RemoveContainer" containerID="9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.610573 4725 scope.go:117] "RemoveContainer" containerID="3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.629986 4725 scope.go:117] "RemoveContainer" containerID="db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.630672 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:21 crc 
kubenswrapper[4725]: I0227 06:34:21.644322 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.646896 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/970a7b0c-8b62-418e-8317-b408eb9e70a0-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.663519 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.663953 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="proxy-httpd" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.663967 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="proxy-httpd" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.663986 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-notification-agent" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.663992 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-notification-agent" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.664009 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-central-agent" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.664016 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-central-agent" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.664028 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="sg-core" Feb 27 06:34:21 crc kubenswrapper[4725]: 
I0227 06:34:21.664033 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="sg-core" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.664205 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="sg-core" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.664221 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-central-agent" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.664233 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="ceilometer-notification-agent" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.664253 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" containerName="proxy-httpd" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.664456 4725 scope.go:117] "RemoveContainer" containerID="3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.665920 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.666336 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc\": container with ID starting with 3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc not found: ID does not exist" containerID="3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.666370 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc"} err="failed to get container status \"3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc\": rpc error: code = NotFound desc = could not find container \"3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc\": container with ID starting with 3ff3e6515abe1ee30a3d9b18cc07804a741b0cedf6729acec1ad95b51d2cc6cc not found: ID does not exist" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.666393 4725 scope.go:117] "RemoveContainer" containerID="9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.668939 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.669081 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.669480 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b\": container with ID starting with 9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b not found: 
ID does not exist" containerID="9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.669502 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b"} err="failed to get container status \"9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b\": rpc error: code = NotFound desc = could not find container \"9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b\": container with ID starting with 9f9389d68b74cc90c5f0ec8568439dd7c85bb178723df80e27735ce46f3b702b not found: ID does not exist" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.669522 4725 scope.go:117] "RemoveContainer" containerID="3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.669667 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.670169 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356\": container with ID starting with 3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356 not found: ID does not exist" containerID="3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.670199 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356"} err="failed to get container status \"3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356\": rpc error: code = NotFound desc = could not find container \"3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356\": container with ID starting with 
3c5d05e7420cb304fd1bf4acb9026d0aa513ccffb040dc47cda0ce15e7d04356 not found: ID does not exist" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.670221 4725 scope.go:117] "RemoveContainer" containerID="db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7" Feb 27 06:34:21 crc kubenswrapper[4725]: E0227 06:34:21.672763 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7\": container with ID starting with db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7 not found: ID does not exist" containerID="db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.672824 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7"} err="failed to get container status \"db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7\": rpc error: code = NotFound desc = could not find container \"db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7\": container with ID starting with db4855bb20e8307737f0c84484cea814a01e7aa5a1e535dc511854d234644be7 not found: ID does not exist" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.690572 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750372 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750422 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qpq\" (UniqueName: \"kubernetes.io/projected/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-kube-api-access-84qpq\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750449 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-config-data\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750470 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-log-httpd\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750547 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-scripts\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750565 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-run-httpd\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.750582 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.853388 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.853449 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84qpq\" (UniqueName: \"kubernetes.io/projected/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-kube-api-access-84qpq\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.853483 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-config-data\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.853696 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-log-httpd\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 
06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.853885 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.853896 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-log-httpd\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.854082 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-scripts\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.854141 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-run-httpd\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.854338 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.854484 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-run-httpd\") pod 
\"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.857192 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.857717 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.866778 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-scripts\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.867590 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.867673 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-config-data\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:21 crc kubenswrapper[4725]: I0227 06:34:21.877196 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84qpq\" 
(UniqueName: \"kubernetes.io/projected/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-kube-api-access-84qpq\") pod \"ceilometer-0\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " pod="openstack/ceilometer-0" Feb 27 06:34:22 crc kubenswrapper[4725]: I0227 06:34:22.000935 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:22 crc kubenswrapper[4725]: I0227 06:34:22.277881 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="970a7b0c-8b62-418e-8317-b408eb9e70a0" path="/var/lib/kubelet/pods/970a7b0c-8b62-418e-8317-b408eb9e70a0/volumes" Feb 27 06:34:22 crc kubenswrapper[4725]: I0227 06:34:22.487378 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:22 crc kubenswrapper[4725]: I0227 06:34:22.584428 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerStarted","Data":"3814464ffa5ffa2bcbee481ca1dd58663657b4409c26acbe833e3ab3cabed923"} Feb 27 06:34:23 crc kubenswrapper[4725]: I0227 06:34:23.604273 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerStarted","Data":"dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868"} Feb 27 06:34:23 crc kubenswrapper[4725]: I0227 06:34:23.604617 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerStarted","Data":"285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1"} Feb 27 06:34:24 crc kubenswrapper[4725]: I0227 06:34:24.217777 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 27 06:34:24 crc kubenswrapper[4725]: I0227 06:34:24.615984 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerStarted","Data":"4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b"} Feb 27 06:34:26 crc kubenswrapper[4725]: I0227 06:34:26.642491 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerStarted","Data":"66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6"} Feb 27 06:34:26 crc kubenswrapper[4725]: I0227 06:34:26.642866 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:34:26 crc kubenswrapper[4725]: I0227 06:34:26.683712 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.754916399 podStartE2EDuration="5.683684314s" podCreationTimestamp="2026-02-27 06:34:21 +0000 UTC" firstStartedPulling="2026-02-27 06:34:22.496234412 +0000 UTC m=+1440.958854981" lastFinishedPulling="2026-02-27 06:34:25.425002317 +0000 UTC m=+1443.887622896" observedRunningTime="2026-02-27 06:34:26.676011698 +0000 UTC m=+1445.138632327" watchObservedRunningTime="2026-02-27 06:34:26.683684314 +0000 UTC m=+1445.146304923" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.652432 4725 generic.go:334] "Generic (PLEG): container finished" podID="9d2a9b00-096a-40dc-8477-1fb98996f32a" containerID="9d98baf31598be01014934cf4857b43b2a77e64c204543b5d7a508b969930958" exitCode=137 Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.652555 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9d2a9b00-096a-40dc-8477-1fb98996f32a","Type":"ContainerDied","Data":"9d98baf31598be01014934cf4857b43b2a77e64c204543b5d7a508b969930958"} Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.653101 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"9d2a9b00-096a-40dc-8477-1fb98996f32a","Type":"ContainerDied","Data":"686ebf5c96ed321ce900ef829e61f96a7a3d17270c109de2eb54a7285dfef76f"} Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.653205 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="686ebf5c96ed321ce900ef829e61f96a7a3d17270c109de2eb54a7285dfef76f" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.655088 4725 generic.go:334] "Generic (PLEG): container finished" podID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerID="829e3fd17516b62cd0b9a2a3bda351e02af1a836238e88f3a7d78e0bc339eb56" exitCode=137 Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.655126 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b20acebe-3d2a-4342-9f32-2b12d70bfd1b","Type":"ContainerDied","Data":"829e3fd17516b62cd0b9a2a3bda351e02af1a836238e88f3a7d78e0bc339eb56"} Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.655169 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b20acebe-3d2a-4342-9f32-2b12d70bfd1b","Type":"ContainerDied","Data":"cdf093296091f994c845935e55c7a083826b351222ae339d8130bcde1a62a6b6"} Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.655184 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf093296091f994c845935e55c7a083826b351222ae339d8130bcde1a62a6b6" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.678818 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.683483 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.698716 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-combined-ca-bundle\") pod \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.698807 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-logs\") pod \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.698855 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-config-data\") pod \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.698939 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shdlb\" (UniqueName: \"kubernetes.io/projected/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-kube-api-access-shdlb\") pod \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\" (UID: \"b20acebe-3d2a-4342-9f32-2b12d70bfd1b\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.700082 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-logs" (OuterVolumeSpecName: "logs") pod "b20acebe-3d2a-4342-9f32-2b12d70bfd1b" (UID: "b20acebe-3d2a-4342-9f32-2b12d70bfd1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.710683 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-kube-api-access-shdlb" (OuterVolumeSpecName: "kube-api-access-shdlb") pod "b20acebe-3d2a-4342-9f32-2b12d70bfd1b" (UID: "b20acebe-3d2a-4342-9f32-2b12d70bfd1b"). InnerVolumeSpecName "kube-api-access-shdlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.746835 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-config-data" (OuterVolumeSpecName: "config-data") pod "b20acebe-3d2a-4342-9f32-2b12d70bfd1b" (UID: "b20acebe-3d2a-4342-9f32-2b12d70bfd1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.765823 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b20acebe-3d2a-4342-9f32-2b12d70bfd1b" (UID: "b20acebe-3d2a-4342-9f32-2b12d70bfd1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806168 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-combined-ca-bundle\") pod \"9d2a9b00-096a-40dc-8477-1fb98996f32a\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806211 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9wcl\" (UniqueName: \"kubernetes.io/projected/9d2a9b00-096a-40dc-8477-1fb98996f32a-kube-api-access-k9wcl\") pod \"9d2a9b00-096a-40dc-8477-1fb98996f32a\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806300 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-config-data\") pod \"9d2a9b00-096a-40dc-8477-1fb98996f32a\" (UID: \"9d2a9b00-096a-40dc-8477-1fb98996f32a\") " Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806873 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806893 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806903 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.806912 4725 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-shdlb\" (UniqueName: \"kubernetes.io/projected/b20acebe-3d2a-4342-9f32-2b12d70bfd1b-kube-api-access-shdlb\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.809882 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2a9b00-096a-40dc-8477-1fb98996f32a-kube-api-access-k9wcl" (OuterVolumeSpecName: "kube-api-access-k9wcl") pod "9d2a9b00-096a-40dc-8477-1fb98996f32a" (UID: "9d2a9b00-096a-40dc-8477-1fb98996f32a"). InnerVolumeSpecName "kube-api-access-k9wcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.831431 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d2a9b00-096a-40dc-8477-1fb98996f32a" (UID: "9d2a9b00-096a-40dc-8477-1fb98996f32a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.836923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-config-data" (OuterVolumeSpecName: "config-data") pod "9d2a9b00-096a-40dc-8477-1fb98996f32a" (UID: "9d2a9b00-096a-40dc-8477-1fb98996f32a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.909176 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.909202 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9wcl\" (UniqueName: \"kubernetes.io/projected/9d2a9b00-096a-40dc-8477-1fb98996f32a-kube-api-access-k9wcl\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:27 crc kubenswrapper[4725]: I0227 06:34:27.909211 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a9b00-096a-40dc-8477-1fb98996f32a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.666900 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.667573 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.697949 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.724689 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.740392 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.751139 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.774549 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: E0227 06:34:28.776153 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2a9b00-096a-40dc-8477-1fb98996f32a" containerName="nova-cell1-novncproxy-novncproxy" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.776183 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2a9b00-096a-40dc-8477-1fb98996f32a" containerName="nova-cell1-novncproxy-novncproxy" Feb 27 06:34:28 crc kubenswrapper[4725]: E0227 06:34:28.776216 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-log" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.776229 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-log" Feb 27 06:34:28 crc kubenswrapper[4725]: E0227 06:34:28.776723 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-metadata" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.776734 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-metadata" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.777442 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-metadata" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.777495 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2a9b00-096a-40dc-8477-1fb98996f32a" containerName="nova-cell1-novncproxy-novncproxy" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.777510 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" containerName="nova-metadata-log" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.782360 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.799352 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.799529 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.803105 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.805662 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.828528 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.829166 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.829376 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.829684 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw225\" (UniqueName: \"kubernetes.io/projected/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-kube-api-access-hw225\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.829857 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.830024 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.831843 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.832043 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.832150 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.858391 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.920199 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.920776 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.922359 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.926442 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931627 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw225\" (UniqueName: \"kubernetes.io/projected/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-kube-api-access-hw225\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931683 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931709 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931765 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931813 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-config-data\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931840 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931910 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.931969 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.932032 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvmp\" (UniqueName: \"kubernetes.io/projected/c4222491-f5d6-40be-832a-728651adf29f-kube-api-access-shvmp\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.932074 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4222491-f5d6-40be-832a-728651adf29f-logs\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.938724 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.947039 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.947870 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.948257 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:28 crc kubenswrapper[4725]: I0227 06:34:28.948536 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw225\" (UniqueName: \"kubernetes.io/projected/48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1-kube-api-access-hw225\") pod \"nova-cell1-novncproxy-0\" (UID: \"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.033582 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.033653 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shvmp\" (UniqueName: \"kubernetes.io/projected/c4222491-f5d6-40be-832a-728651adf29f-kube-api-access-shvmp\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.033681 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4222491-f5d6-40be-832a-728651adf29f-logs\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.033757 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.033827 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-config-data\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.035082 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4222491-f5d6-40be-832a-728651adf29f-logs\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.036915 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-config-data\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.037268 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.037786 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.200172 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.238015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvmp\" (UniqueName: \"kubernetes.io/projected/c4222491-f5d6-40be-832a-728651adf29f-kube-api-access-shvmp\") pod \"nova-metadata-0\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.450102 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.681637 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.707795 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.711208 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.872980 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59749476c-fklwv"] Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.876168 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.893976 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59749476c-fklwv"] Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.950435 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4s6r\" (UniqueName: \"kubernetes.io/projected/c012782a-1d54-4605-9d08-4ffacc6dc1a1-kube-api-access-x4s6r\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.950721 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-config\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.950801 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.950828 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-swift-storage-0\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.950846 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-svc\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.950933 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:29 crc kubenswrapper[4725]: W0227 06:34:29.953407 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4222491_f5d6_40be_832a_728651adf29f.slice/crio-21b4a4efad6c0c730784ad2031ca63baf8e1e529cb1cb62a93db313f5a41d383 WatchSource:0}: Error finding container 21b4a4efad6c0c730784ad2031ca63baf8e1e529cb1cb62a93db313f5a41d383: Status 404 returned error can't find the container with id 21b4a4efad6c0c730784ad2031ca63baf8e1e529cb1cb62a93db313f5a41d383 Feb 27 06:34:29 crc kubenswrapper[4725]: I0227 06:34:29.961221 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.052965 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.053008 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4s6r\" (UniqueName: 
\"kubernetes.io/projected/c012782a-1d54-4605-9d08-4ffacc6dc1a1-kube-api-access-x4s6r\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.053030 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-config\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.053095 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.053119 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-swift-storage-0\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.053142 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-svc\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.053899 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-svc\") pod 
\"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.054469 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.055196 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-config\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.055710 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.056190 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-swift-storage-0\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.077315 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4s6r\" (UniqueName: \"kubernetes.io/projected/c012782a-1d54-4605-9d08-4ffacc6dc1a1-kube-api-access-x4s6r\") pod \"dnsmasq-dns-59749476c-fklwv\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") " 
pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.211833 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.267205 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2a9b00-096a-40dc-8477-1fb98996f32a" path="/var/lib/kubelet/pods/9d2a9b00-096a-40dc-8477-1fb98996f32a/volumes" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.267740 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20acebe-3d2a-4342-9f32-2b12d70bfd1b" path="/var/lib/kubelet/pods/b20acebe-3d2a-4342-9f32-2b12d70bfd1b/volumes" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.513225 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59749476c-fklwv"] Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.689267 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1","Type":"ContainerStarted","Data":"5490ab6261625e9f91a53b7ad75eabc067c04d87b8c431ef3b15e17876698bba"} Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.689332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1","Type":"ContainerStarted","Data":"a07db3246096b60baee966c5c018a828c05f416ee570badc0f4914d4f6fd6971"} Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.692164 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4222491-f5d6-40be-832a-728651adf29f","Type":"ContainerStarted","Data":"7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5"} Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.692208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c4222491-f5d6-40be-832a-728651adf29f","Type":"ContainerStarted","Data":"65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03"} Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.692221 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4222491-f5d6-40be-832a-728651adf29f","Type":"ContainerStarted","Data":"21b4a4efad6c0c730784ad2031ca63baf8e1e529cb1cb62a93db313f5a41d383"} Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.693895 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59749476c-fklwv" event={"ID":"c012782a-1d54-4605-9d08-4ffacc6dc1a1","Type":"ContainerStarted","Data":"df041be80598e27404afb6b1c66ad838d9051c661fdb37109d4030d31b1b6ecf"} Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.712674 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.712657371 podStartE2EDuration="2.712657371s" podCreationTimestamp="2026-02-27 06:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:30.711357124 +0000 UTC m=+1449.173977703" watchObservedRunningTime="2026-02-27 06:34:30.712657371 +0000 UTC m=+1449.175277940" Feb 27 06:34:30 crc kubenswrapper[4725]: I0227 06:34:30.755255 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.75523387 podStartE2EDuration="2.75523387s" podCreationTimestamp="2026-02-27 06:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:30.73003419 +0000 UTC m=+1449.192654759" watchObservedRunningTime="2026-02-27 06:34:30.75523387 +0000 UTC m=+1449.217854429" Feb 27 06:34:31 crc kubenswrapper[4725]: I0227 06:34:31.704234 4725 generic.go:334] "Generic (PLEG): container 
finished" podID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerID="d3a618062f19f9c10ae587093c2976f04668bc73273b71bb5784fcf3686f4a81" exitCode=0 Feb 27 06:34:31 crc kubenswrapper[4725]: I0227 06:34:31.704331 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59749476c-fklwv" event={"ID":"c012782a-1d54-4605-9d08-4ffacc6dc1a1","Type":"ContainerDied","Data":"d3a618062f19f9c10ae587093c2976f04668bc73273b71bb5784fcf3686f4a81"} Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.137119 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.137452 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-central-agent" containerID="cri-o://285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1" gracePeriod=30 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.137541 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-notification-agent" containerID="cri-o://dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868" gracePeriod=30 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.137557 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="sg-core" containerID="cri-o://4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b" gracePeriod=30 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.137563 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="proxy-httpd" containerID="cri-o://66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6" 
gracePeriod=30 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.721661 4725 generic.go:334] "Generic (PLEG): container finished" podID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerID="66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6" exitCode=0 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.721692 4725 generic.go:334] "Generic (PLEG): container finished" podID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerID="4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b" exitCode=2 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.721701 4725 generic.go:334] "Generic (PLEG): container finished" podID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerID="285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1" exitCode=0 Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.721737 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerDied","Data":"66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6"} Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.721760 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerDied","Data":"4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b"} Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.721770 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerDied","Data":"285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1"} Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.723711 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59749476c-fklwv" event={"ID":"c012782a-1d54-4605-9d08-4ffacc6dc1a1","Type":"ContainerStarted","Data":"8595668007a4d363f6370965bf76d4cdb0329ca854fa09cb5feb75235fd4c35a"} Feb 27 
06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.723951 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:32 crc kubenswrapper[4725]: I0227 06:34:32.746888 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59749476c-fklwv" podStartSLOduration=3.746872016 podStartE2EDuration="3.746872016s" podCreationTimestamp="2026-02-27 06:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:32.741481334 +0000 UTC m=+1451.204101913" watchObservedRunningTime="2026-02-27 06:34:32.746872016 +0000 UTC m=+1451.209492585" Feb 27 06:34:33 crc kubenswrapper[4725]: I0227 06:34:33.250510 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:33 crc kubenswrapper[4725]: I0227 06:34:33.251093 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-log" containerID="cri-o://5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253" gracePeriod=30 Feb 27 06:34:33 crc kubenswrapper[4725]: I0227 06:34:33.251599 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-api" containerID="cri-o://7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f" gracePeriod=30 Feb 27 06:34:33 crc kubenswrapper[4725]: I0227 06:34:33.749730 4725 generic.go:334] "Generic (PLEG): container finished" podID="cf5c1d61-206a-4731-90d7-19755263893a" containerID="5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253" exitCode=143 Feb 27 06:34:33 crc kubenswrapper[4725]: I0227 06:34:33.749784 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"cf5c1d61-206a-4731-90d7-19755263893a","Type":"ContainerDied","Data":"5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253"} Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.200461 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.452205 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.453372 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.548248 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.654047 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-combined-ca-bundle\") pod \"cf5c1d61-206a-4731-90d7-19755263893a\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.654105 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-config-data\") pod \"cf5c1d61-206a-4731-90d7-19755263893a\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.654188 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvdc4\" (UniqueName: \"kubernetes.io/projected/cf5c1d61-206a-4731-90d7-19755263893a-kube-api-access-nvdc4\") pod \"cf5c1d61-206a-4731-90d7-19755263893a\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.654278 4725 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5c1d61-206a-4731-90d7-19755263893a-logs\") pod \"cf5c1d61-206a-4731-90d7-19755263893a\" (UID: \"cf5c1d61-206a-4731-90d7-19755263893a\") " Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.655078 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5c1d61-206a-4731-90d7-19755263893a-logs" (OuterVolumeSpecName: "logs") pod "cf5c1d61-206a-4731-90d7-19755263893a" (UID: "cf5c1d61-206a-4731-90d7-19755263893a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.678010 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5c1d61-206a-4731-90d7-19755263893a-kube-api-access-nvdc4" (OuterVolumeSpecName: "kube-api-access-nvdc4") pod "cf5c1d61-206a-4731-90d7-19755263893a" (UID: "cf5c1d61-206a-4731-90d7-19755263893a"). InnerVolumeSpecName "kube-api-access-nvdc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.685876 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf5c1d61-206a-4731-90d7-19755263893a" (UID: "cf5c1d61-206a-4731-90d7-19755263893a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.700587 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-config-data" (OuterVolumeSpecName: "config-data") pod "cf5c1d61-206a-4731-90d7-19755263893a" (UID: "cf5c1d61-206a-4731-90d7-19755263893a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.770677 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.770716 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5c1d61-206a-4731-90d7-19755263893a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.770726 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvdc4\" (UniqueName: \"kubernetes.io/projected/cf5c1d61-206a-4731-90d7-19755263893a-kube-api-access-nvdc4\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.770736 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5c1d61-206a-4731-90d7-19755263893a-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.774273 4725 generic.go:334] "Generic (PLEG): container finished" podID="cf5c1d61-206a-4731-90d7-19755263893a" containerID="7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f" exitCode=0 Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.774393 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf5c1d61-206a-4731-90d7-19755263893a","Type":"ContainerDied","Data":"7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f"} Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.774417 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.774442 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf5c1d61-206a-4731-90d7-19755263893a","Type":"ContainerDied","Data":"356cad0bd62035b13a23a0fe5d7494fbe108efb1a20d8ef87518dc23735927fa"} Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.774463 4725 scope.go:117] "RemoveContainer" containerID="7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.805472 4725 scope.go:117] "RemoveContainer" containerID="5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.813401 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.827500 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.837402 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:34 crc kubenswrapper[4725]: E0227 06:34:34.838043 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-log" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.838069 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-log" Feb 27 06:34:34 crc kubenswrapper[4725]: E0227 06:34:34.838108 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-api" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.838118 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-api" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.839324 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-log" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.839358 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5c1d61-206a-4731-90d7-19755263893a" containerName="nova-api-api" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.840590 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.842783 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.842881 4725 scope.go:117] "RemoveContainer" containerID="7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.844050 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.844816 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 06:34:34 crc kubenswrapper[4725]: E0227 06:34:34.845093 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f\": container with ID starting with 7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f not found: ID does not exist" containerID="7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.845129 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f"} err="failed to get container status \"7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f\": rpc error: code = 
NotFound desc = could not find container \"7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f\": container with ID starting with 7ab3142de00dfaca0b44c27ae0565404820f2ada771d02a7486e5f2d4177cd9f not found: ID does not exist" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.845153 4725 scope.go:117] "RemoveContainer" containerID="5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253" Feb 27 06:34:34 crc kubenswrapper[4725]: E0227 06:34:34.845815 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253\": container with ID starting with 5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253 not found: ID does not exist" containerID="5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.845842 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253"} err="failed to get container status \"5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253\": rpc error: code = NotFound desc = could not find container \"5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253\": container with ID starting with 5b5a99272fa1e663b9a0fddadb65086b121cd01d41aae8d546f3ce201c683253 not found: ID does not exist" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.850379 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.872497 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-logs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc 
kubenswrapper[4725]: I0227 06:34:34.872557 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.872588 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.872637 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.872677 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-config-data\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.872769 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftm8\" (UniqueName: \"kubernetes.io/projected/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-kube-api-access-6ftm8\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.975330 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6ftm8\" (UniqueName: \"kubernetes.io/projected/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-kube-api-access-6ftm8\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.975525 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-logs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.975571 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.975588 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.975631 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.975661 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-config-data\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc 
kubenswrapper[4725]: I0227 06:34:34.976027 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-logs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.981271 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.991038 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.991038 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.991149 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-config-data\") pod \"nova-api-0\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:34 crc kubenswrapper[4725]: I0227 06:34:34.994376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftm8\" (UniqueName: \"kubernetes.io/projected/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-kube-api-access-6ftm8\") pod \"nova-api-0\" (UID: 
\"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " pod="openstack/nova-api-0" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.165015 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.722430 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.788137 4725 generic.go:334] "Generic (PLEG): container finished" podID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerID="dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868" exitCode=0 Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.788180 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerDied","Data":"dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868"} Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.788202 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.788230 4725 scope.go:117] "RemoveContainer" containerID="66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.788216 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56","Type":"ContainerDied","Data":"3814464ffa5ffa2bcbee481ca1dd58663657b4409c26acbe833e3ab3cabed923"} Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.811482 4725 scope.go:117] "RemoveContainer" containerID="4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812188 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-sg-core-conf-yaml\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812223 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84qpq\" (UniqueName: \"kubernetes.io/projected/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-kube-api-access-84qpq\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812266 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-combined-ca-bundle\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812397 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-ceilometer-tls-certs\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812422 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-config-data\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812483 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-scripts\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812603 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-run-httpd\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.812630 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-log-httpd\") pod \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\" (UID: \"4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56\") " Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.814026 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.814889 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.818575 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-scripts" (OuterVolumeSpecName: "scripts") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.821918 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:35 crc kubenswrapper[4725]: W0227 06:34:35.828549 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bd2fa1a_0d1f_418b_9647_d4025d10c3a7.slice/crio-56fcf121ab757d015fd0616df524c170ecab0542aed0b3fa5a6368805454ae2b WatchSource:0}: Error finding container 56fcf121ab757d015fd0616df524c170ecab0542aed0b3fa5a6368805454ae2b: Status 404 returned error can't find the container with id 56fcf121ab757d015fd0616df524c170ecab0542aed0b3fa5a6368805454ae2b Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.830945 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-kube-api-access-84qpq" (OuterVolumeSpecName: "kube-api-access-84qpq") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "kube-api-access-84qpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.850507 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.869778 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.902807 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.913147 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-config-data" (OuterVolumeSpecName: "config-data") pod "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" (UID: "4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914233 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914264 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914275 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914378 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914393 4725 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914403 4725 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914414 4725 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.914426 4725 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-84qpq\" (UniqueName: \"kubernetes.io/projected/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56-kube-api-access-84qpq\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.933492 4725 scope.go:117] "RemoveContainer" containerID="dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.959568 4725 scope.go:117] "RemoveContainer" containerID="285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.984253 4725 scope.go:117] "RemoveContainer" containerID="66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6" Feb 27 06:34:35 crc kubenswrapper[4725]: E0227 06:34:35.984703 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6\": container with ID starting with 66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6 not found: ID does not exist" containerID="66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.984743 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6"} err="failed to get container status \"66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6\": rpc error: code = NotFound desc = could not find container \"66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6\": container with ID starting with 66853ff5498abf6f5ad401dedd454cba3168cac2c539e459bf811d15a66c44c6 not found: ID does not exist" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.984772 4725 scope.go:117] "RemoveContainer" containerID="4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b" Feb 27 06:34:35 crc 
kubenswrapper[4725]: E0227 06:34:35.985057 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b\": container with ID starting with 4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b not found: ID does not exist" containerID="4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.985096 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b"} err="failed to get container status \"4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b\": rpc error: code = NotFound desc = could not find container \"4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b\": container with ID starting with 4b3661e7ff3e25a6ed2e649174a38bfec2f3be42fdaa288effab379b55b9460b not found: ID does not exist" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.985125 4725 scope.go:117] "RemoveContainer" containerID="dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868" Feb 27 06:34:35 crc kubenswrapper[4725]: E0227 06:34:35.985379 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868\": container with ID starting with dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868 not found: ID does not exist" containerID="dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.985410 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868"} err="failed to get container status 
\"dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868\": rpc error: code = NotFound desc = could not find container \"dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868\": container with ID starting with dbd19912f408491daa3dc202ef9dfef51a2a54106fdc494e9827e799dca34868 not found: ID does not exist" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.985428 4725 scope.go:117] "RemoveContainer" containerID="285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1" Feb 27 06:34:35 crc kubenswrapper[4725]: E0227 06:34:35.985633 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1\": container with ID starting with 285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1 not found: ID does not exist" containerID="285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1" Feb 27 06:34:35 crc kubenswrapper[4725]: I0227 06:34:35.985656 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1"} err="failed to get container status \"285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1\": rpc error: code = NotFound desc = could not find container \"285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1\": container with ID starting with 285783b864cc2ae11add4b7c39e08738fa01fc1566c84c2df76d3ada19831fa1 not found: ID does not exist" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.132025 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.138971 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.163416 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Feb 27 06:34:36 crc kubenswrapper[4725]: E0227 06:34:36.163844 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-notification-agent" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.163859 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-notification-agent" Feb 27 06:34:36 crc kubenswrapper[4725]: E0227 06:34:36.163888 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="proxy-httpd" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.163895 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="proxy-httpd" Feb 27 06:34:36 crc kubenswrapper[4725]: E0227 06:34:36.163907 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-central-agent" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.163913 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-central-agent" Feb 27 06:34:36 crc kubenswrapper[4725]: E0227 06:34:36.163929 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="sg-core" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.163935 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="sg-core" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.164132 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-notification-agent" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.164143 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="sg-core" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.164157 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="proxy-httpd" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.164171 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" containerName="ceilometer-central-agent" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.165880 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.169102 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.169946 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.170197 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.212837 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221184 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sh88\" (UniqueName: \"kubernetes.io/projected/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-kube-api-access-5sh88\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221268 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-log-httpd\") pod \"ceilometer-0\" (UID: 
\"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221354 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-scripts\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221481 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221516 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-config-data\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221535 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.221556 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 
06:34:36.221657 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-run-httpd\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.275719 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56" path="/var/lib/kubelet/pods/4fd13ead-cca8-4b7c-b179-2a3f4d1cbe56/volumes" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.276760 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5c1d61-206a-4731-90d7-19755263893a" path="/var/lib/kubelet/pods/cf5c1d61-206a-4731-90d7-19755263893a/volumes" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.323556 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-log-httpd\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.323921 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-scripts\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.324394 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-log-httpd\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325001 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325197 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-config-data\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325236 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325387 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-run-httpd\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325520 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sh88\" (UniqueName: \"kubernetes.io/projected/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-kube-api-access-5sh88\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " 
pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.325882 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-run-httpd\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.328913 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-scripts\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.329390 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.329539 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.329585 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-config-data\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.330099 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.357164 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sh88\" (UniqueName: \"kubernetes.io/projected/c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd-kube-api-access-5sh88\") pod \"ceilometer-0\" (UID: \"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd\") " pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.485862 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.798411 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7","Type":"ContainerStarted","Data":"d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48"} Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.798834 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7","Type":"ContainerStarted","Data":"eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac"} Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.798850 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7","Type":"ContainerStarted","Data":"56fcf121ab757d015fd0616df524c170ecab0542aed0b3fa5a6368805454ae2b"} Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.822834 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.822815485 podStartE2EDuration="2.822815485s" podCreationTimestamp="2026-02-27 06:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 06:34:36.814167071 +0000 UTC m=+1455.276787660" watchObservedRunningTime="2026-02-27 06:34:36.822815485 +0000 UTC m=+1455.285436054" Feb 27 06:34:36 crc kubenswrapper[4725]: I0227 06:34:36.948615 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 06:34:36 crc kubenswrapper[4725]: W0227 06:34:36.961418 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9649343_ccb7_4cbf_a1dd_4d4c0cc3dedd.slice/crio-9a533a45e51831a47c72c688ac1ba2f920cc6475b17dbb2da0e84007c69dd24e WatchSource:0}: Error finding container 9a533a45e51831a47c72c688ac1ba2f920cc6475b17dbb2da0e84007c69dd24e: Status 404 returned error can't find the container with id 9a533a45e51831a47c72c688ac1ba2f920cc6475b17dbb2da0e84007c69dd24e Feb 27 06:34:37 crc kubenswrapper[4725]: I0227 06:34:37.810636 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd","Type":"ContainerStarted","Data":"53167d776c0f517495fb9e0d4d164b72d6a79c70c8986357eb13096c4c89bdf2"} Feb 27 06:34:37 crc kubenswrapper[4725]: I0227 06:34:37.810969 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd","Type":"ContainerStarted","Data":"b5c18f942fd1b7301288a11fd6c555a193cf985af2d19d696e861feb3dfd6de3"} Feb 27 06:34:37 crc kubenswrapper[4725]: I0227 06:34:37.810987 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd","Type":"ContainerStarted","Data":"9a533a45e51831a47c72c688ac1ba2f920cc6475b17dbb2da0e84007c69dd24e"} Feb 27 06:34:38 crc kubenswrapper[4725]: I0227 06:34:38.836357 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd","Type":"ContainerStarted","Data":"c51da686b25849c5e614629b937f1daf9b7b7c14b45562dc9060833a4e8db21c"} Feb 27 06:34:39 crc kubenswrapper[4725]: I0227 06:34:39.200853 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:39 crc kubenswrapper[4725]: I0227 06:34:39.228494 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:39 crc kubenswrapper[4725]: I0227 06:34:39.451202 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 06:34:39 crc kubenswrapper[4725]: I0227 06:34:39.451947 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 06:34:39 crc kubenswrapper[4725]: I0227 06:34:39.866269 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.001780 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wzwpl"] Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.003180 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.006745 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.006804 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.027992 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wzwpl"] Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.034007 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-config-data\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.034138 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.034252 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-scripts\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.034304 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw425\" (UniqueName: 
\"kubernetes.io/projected/64cdb049-7b8a-4c79-8c77-678172f96778-kube-api-access-lw425\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.136122 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-config-data\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.136551 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.136592 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-scripts\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.136614 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw425\" (UniqueName: \"kubernetes.io/projected/64cdb049-7b8a-4c79-8c77-678172f96778-kube-api-access-lw425\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.145470 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-config-data\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.145831 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-scripts\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.146984 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.157982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw425\" (UniqueName: \"kubernetes.io/projected/64cdb049-7b8a-4c79-8c77-678172f96778-kube-api-access-lw425\") pod \"nova-cell1-cell-mapping-wzwpl\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.216067 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59749476c-fklwv" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.306310 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f575c69f9-k8vk2"] Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.310665 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerName="dnsmasq-dns" 
containerID="cri-o://fd1998b0e1b622db24ec2e1af09893d4ac39704d70bc87e391e687da67841d45" gracePeriod=10 Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.322065 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:40 crc kubenswrapper[4725]: E0227 06:34:40.346633 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod837a7d0c_6989_4582_9d05_8b5c73db83a2.slice/crio-fd1998b0e1b622db24ec2e1af09893d4ac39704d70bc87e391e687da67841d45.scope\": RecentStats: unable to find data in memory cache]" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.478179 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.478691 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.864205 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd","Type":"ContainerStarted","Data":"4b843f26b45fe11a7ac8af1ec20a868d734ea988f936b7360b76323230b55930"} Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.864863 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.872907 4725 generic.go:334] 
"Generic (PLEG): container finished" podID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerID="fd1998b0e1b622db24ec2e1af09893d4ac39704d70bc87e391e687da67841d45" exitCode=0 Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.873189 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" event={"ID":"837a7d0c-6989-4582-9d05-8b5c73db83a2","Type":"ContainerDied","Data":"fd1998b0e1b622db24ec2e1af09893d4ac39704d70bc87e391e687da67841d45"} Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.873264 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" event={"ID":"837a7d0c-6989-4582-9d05-8b5c73db83a2","Type":"ContainerDied","Data":"1b1cc66f600bc539e474badecc623129859a77593b7c79139864e185b900188f"} Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.873277 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b1cc66f600bc539e474badecc623129859a77593b7c79139864e185b900188f" Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.882139 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wzwpl"] Feb 27 06:34:40 crc kubenswrapper[4725]: I0227 06:34:40.905310 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.090135435 podStartE2EDuration="4.905269328s" podCreationTimestamp="2026-02-27 06:34:36 +0000 UTC" firstStartedPulling="2026-02-27 06:34:36.969381904 +0000 UTC m=+1455.432002473" lastFinishedPulling="2026-02-27 06:34:39.784515797 +0000 UTC m=+1458.247136366" observedRunningTime="2026-02-27 06:34:40.8918439 +0000 UTC m=+1459.354464479" watchObservedRunningTime="2026-02-27 06:34:40.905269328 +0000 UTC m=+1459.367889897" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.007674 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.055841 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh24v\" (UniqueName: \"kubernetes.io/projected/837a7d0c-6989-4582-9d05-8b5c73db83a2-kube-api-access-jh24v\") pod \"837a7d0c-6989-4582-9d05-8b5c73db83a2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.055899 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-config\") pod \"837a7d0c-6989-4582-9d05-8b5c73db83a2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.056016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-nb\") pod \"837a7d0c-6989-4582-9d05-8b5c73db83a2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.056138 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-swift-storage-0\") pod \"837a7d0c-6989-4582-9d05-8b5c73db83a2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.056163 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-sb\") pod \"837a7d0c-6989-4582-9d05-8b5c73db83a2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.056200 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-svc\") pod \"837a7d0c-6989-4582-9d05-8b5c73db83a2\" (UID: \"837a7d0c-6989-4582-9d05-8b5c73db83a2\") " Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.066521 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837a7d0c-6989-4582-9d05-8b5c73db83a2-kube-api-access-jh24v" (OuterVolumeSpecName: "kube-api-access-jh24v") pod "837a7d0c-6989-4582-9d05-8b5c73db83a2" (UID: "837a7d0c-6989-4582-9d05-8b5c73db83a2"). InnerVolumeSpecName "kube-api-access-jh24v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.123781 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "837a7d0c-6989-4582-9d05-8b5c73db83a2" (UID: "837a7d0c-6989-4582-9d05-8b5c73db83a2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.145020 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "837a7d0c-6989-4582-9d05-8b5c73db83a2" (UID: "837a7d0c-6989-4582-9d05-8b5c73db83a2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.159230 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.159263 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.159273 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh24v\" (UniqueName: \"kubernetes.io/projected/837a7d0c-6989-4582-9d05-8b5c73db83a2-kube-api-access-jh24v\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.168752 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "837a7d0c-6989-4582-9d05-8b5c73db83a2" (UID: "837a7d0c-6989-4582-9d05-8b5c73db83a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.189201 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-config" (OuterVolumeSpecName: "config") pod "837a7d0c-6989-4582-9d05-8b5c73db83a2" (UID: "837a7d0c-6989-4582-9d05-8b5c73db83a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.200513 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "837a7d0c-6989-4582-9d05-8b5c73db83a2" (UID: "837a7d0c-6989-4582-9d05-8b5c73db83a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.260649 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.260685 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.260699 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837a7d0c-6989-4582-9d05-8b5c73db83a2-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.881905 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wzwpl" event={"ID":"64cdb049-7b8a-4c79-8c77-678172f96778","Type":"ContainerStarted","Data":"655cebfcb00fd50cafc68d9c30ef5633763964a6ccade3ab5376920ab1631492"} Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.881947 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wzwpl" event={"ID":"64cdb049-7b8a-4c79-8c77-678172f96778","Type":"ContainerStarted","Data":"fc2ee24f6cbdf6e274662e20d902176dc02b47122c5f5351065ebc966e9c601b"} Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.882025 4725 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f575c69f9-k8vk2" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.915158 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wzwpl" podStartSLOduration=2.915137957 podStartE2EDuration="2.915137957s" podCreationTimestamp="2026-02-27 06:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:41.910051093 +0000 UTC m=+1460.372671692" watchObservedRunningTime="2026-02-27 06:34:41.915137957 +0000 UTC m=+1460.377758536" Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.940273 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f575c69f9-k8vk2"] Feb 27 06:34:41 crc kubenswrapper[4725]: I0227 06:34:41.947656 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f575c69f9-k8vk2"] Feb 27 06:34:42 crc kubenswrapper[4725]: I0227 06:34:42.288795 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" path="/var/lib/kubelet/pods/837a7d0c-6989-4582-9d05-8b5c73db83a2/volumes" Feb 27 06:34:45 crc kubenswrapper[4725]: I0227 06:34:45.165251 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:34:45 crc kubenswrapper[4725]: I0227 06:34:45.165374 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:34:46 crc kubenswrapper[4725]: I0227 06:34:46.185526 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.229:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:46 crc kubenswrapper[4725]: I0227 
06:34:46.185537 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.229:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:34:46 crc kubenswrapper[4725]: I0227 06:34:46.960094 4725 generic.go:334] "Generic (PLEG): container finished" podID="64cdb049-7b8a-4c79-8c77-678172f96778" containerID="655cebfcb00fd50cafc68d9c30ef5633763964a6ccade3ab5376920ab1631492" exitCode=0 Feb 27 06:34:46 crc kubenswrapper[4725]: I0227 06:34:46.960171 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wzwpl" event={"ID":"64cdb049-7b8a-4c79-8c77-678172f96778","Type":"ContainerDied","Data":"655cebfcb00fd50cafc68d9c30ef5633763964a6ccade3ab5376920ab1631492"} Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.446657 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.628038 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-scripts\") pod \"64cdb049-7b8a-4c79-8c77-678172f96778\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.628379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-combined-ca-bundle\") pod \"64cdb049-7b8a-4c79-8c77-678172f96778\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.628457 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw425\" (UniqueName: \"kubernetes.io/projected/64cdb049-7b8a-4c79-8c77-678172f96778-kube-api-access-lw425\") pod \"64cdb049-7b8a-4c79-8c77-678172f96778\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.628522 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-config-data\") pod \"64cdb049-7b8a-4c79-8c77-678172f96778\" (UID: \"64cdb049-7b8a-4c79-8c77-678172f96778\") " Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.634627 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-scripts" (OuterVolumeSpecName: "scripts") pod "64cdb049-7b8a-4c79-8c77-678172f96778" (UID: "64cdb049-7b8a-4c79-8c77-678172f96778"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.636079 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64cdb049-7b8a-4c79-8c77-678172f96778-kube-api-access-lw425" (OuterVolumeSpecName: "kube-api-access-lw425") pod "64cdb049-7b8a-4c79-8c77-678172f96778" (UID: "64cdb049-7b8a-4c79-8c77-678172f96778"). InnerVolumeSpecName "kube-api-access-lw425". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.676526 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64cdb049-7b8a-4c79-8c77-678172f96778" (UID: "64cdb049-7b8a-4c79-8c77-678172f96778"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.684453 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-config-data" (OuterVolumeSpecName: "config-data") pod "64cdb049-7b8a-4c79-8c77-678172f96778" (UID: "64cdb049-7b8a-4c79-8c77-678172f96778"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.731216 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.731248 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw425\" (UniqueName: \"kubernetes.io/projected/64cdb049-7b8a-4c79-8c77-678172f96778-kube-api-access-lw425\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.731260 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.731268 4725 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cdb049-7b8a-4c79-8c77-678172f96778-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.985732 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wzwpl" event={"ID":"64cdb049-7b8a-4c79-8c77-678172f96778","Type":"ContainerDied","Data":"fc2ee24f6cbdf6e274662e20d902176dc02b47122c5f5351065ebc966e9c601b"} Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.985767 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc2ee24f6cbdf6e274662e20d902176dc02b47122c5f5351065ebc966e9c601b" Feb 27 06:34:48 crc kubenswrapper[4725]: I0227 06:34:48.985785 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wzwpl" Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.197101 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.197667 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" containerName="nova-scheduler-scheduler" containerID="cri-o://42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" gracePeriod=30 Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.212314 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.212596 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-log" containerID="cri-o://eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac" gracePeriod=30 Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.212711 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-api" containerID="cri-o://d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48" gracePeriod=30 Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.241962 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.242193 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-log" containerID="cri-o://65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03" gracePeriod=30 Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.242382 4725 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-metadata" containerID="cri-o://7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5" gracePeriod=30 Feb 27 06:34:49 crc kubenswrapper[4725]: E0227 06:34:49.833801 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 06:34:49 crc kubenswrapper[4725]: E0227 06:34:49.839057 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 06:34:49 crc kubenswrapper[4725]: E0227 06:34:49.846688 4725 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 06:34:49 crc kubenswrapper[4725]: E0227 06:34:49.846752 4725 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" containerName="nova-scheduler-scheduler" Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.999749 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="c4222491-f5d6-40be-832a-728651adf29f" containerID="65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03" exitCode=143 Feb 27 06:34:49 crc kubenswrapper[4725]: I0227 06:34:49.999813 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4222491-f5d6-40be-832a-728651adf29f","Type":"ContainerDied","Data":"65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03"} Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.001999 4725 generic.go:334] "Generic (PLEG): container finished" podID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerID="eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac" exitCode=143 Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.002021 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7","Type":"ContainerDied","Data":"eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac"} Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.551653 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.597605 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-public-tls-certs\") pod \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.597695 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ftm8\" (UniqueName: \"kubernetes.io/projected/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-kube-api-access-6ftm8\") pod \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.597857 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-logs\") pod \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.597896 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-internal-tls-certs\") pod \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.597989 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-combined-ca-bundle\") pod \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.598101 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-config-data\") pod \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\" (UID: \"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.601943 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-logs" (OuterVolumeSpecName: "logs") pod "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" (UID: "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.623226 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-kube-api-access-6ftm8" (OuterVolumeSpecName: "kube-api-access-6ftm8") pod "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" (UID: "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7"). InnerVolumeSpecName "kube-api-access-6ftm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.639308 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-config-data" (OuterVolumeSpecName: "config-data") pod "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" (UID: "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.662352 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" (UID: "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.664672 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" (UID: "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.673196 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" (UID: "1bd2fa1a-0d1f-418b-9647-d4025d10c3a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.698805 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.703152 4725 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.703181 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ftm8\" (UniqueName: \"kubernetes.io/projected/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-kube-api-access-6ftm8\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.703191 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.703202 4725 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.703211 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.703220 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.804316 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-nova-metadata-tls-certs\") pod \"c4222491-f5d6-40be-832a-728651adf29f\" (UID: 
\"c4222491-f5d6-40be-832a-728651adf29f\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.804431 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shvmp\" (UniqueName: \"kubernetes.io/projected/c4222491-f5d6-40be-832a-728651adf29f-kube-api-access-shvmp\") pod \"c4222491-f5d6-40be-832a-728651adf29f\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.804613 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4222491-f5d6-40be-832a-728651adf29f-logs\") pod \"c4222491-f5d6-40be-832a-728651adf29f\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.804701 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-config-data\") pod \"c4222491-f5d6-40be-832a-728651adf29f\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.804837 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-combined-ca-bundle\") pod \"c4222491-f5d6-40be-832a-728651adf29f\" (UID: \"c4222491-f5d6-40be-832a-728651adf29f\") " Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.805115 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4222491-f5d6-40be-832a-728651adf29f-logs" (OuterVolumeSpecName: "logs") pod "c4222491-f5d6-40be-832a-728651adf29f" (UID: "c4222491-f5d6-40be-832a-728651adf29f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.805455 4725 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4222491-f5d6-40be-832a-728651adf29f-logs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.811255 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4222491-f5d6-40be-832a-728651adf29f-kube-api-access-shvmp" (OuterVolumeSpecName: "kube-api-access-shvmp") pod "c4222491-f5d6-40be-832a-728651adf29f" (UID: "c4222491-f5d6-40be-832a-728651adf29f"). InnerVolumeSpecName "kube-api-access-shvmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.830617 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-config-data" (OuterVolumeSpecName: "config-data") pod "c4222491-f5d6-40be-832a-728651adf29f" (UID: "c4222491-f5d6-40be-832a-728651adf29f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.832742 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4222491-f5d6-40be-832a-728651adf29f" (UID: "c4222491-f5d6-40be-832a-728651adf29f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.872300 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c4222491-f5d6-40be-832a-728651adf29f" (UID: "c4222491-f5d6-40be-832a-728651adf29f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.907467 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shvmp\" (UniqueName: \"kubernetes.io/projected/c4222491-f5d6-40be-832a-728651adf29f-kube-api-access-shvmp\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.907504 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.907514 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:50 crc kubenswrapper[4725]: I0227 06:34:50.907522 4725 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4222491-f5d6-40be-832a-728651adf29f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.015848 4725 generic.go:334] "Generic (PLEG): container finished" podID="c4222491-f5d6-40be-832a-728651adf29f" containerID="7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5" exitCode=0 Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.015998 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.021849 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4222491-f5d6-40be-832a-728651adf29f","Type":"ContainerDied","Data":"7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5"} Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.021924 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4222491-f5d6-40be-832a-728651adf29f","Type":"ContainerDied","Data":"21b4a4efad6c0c730784ad2031ca63baf8e1e529cb1cb62a93db313f5a41d383"} Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.021957 4725 scope.go:117] "RemoveContainer" containerID="7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.025581 4725 generic.go:334] "Generic (PLEG): container finished" podID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerID="d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48" exitCode=0 Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.025631 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7","Type":"ContainerDied","Data":"d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48"} Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.025657 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1bd2fa1a-0d1f-418b-9647-d4025d10c3a7","Type":"ContainerDied","Data":"56fcf121ab757d015fd0616df524c170ecab0542aed0b3fa5a6368805454ae2b"} Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.025716 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.081612 4725 scope.go:117] "RemoveContainer" containerID="65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.085105 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.105024 4725 scope.go:117] "RemoveContainer" containerID="7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.105861 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5\": container with ID starting with 7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5 not found: ID does not exist" containerID="7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.105985 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5"} err="failed to get container status \"7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5\": rpc error: code = NotFound desc = could not find container \"7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5\": container with ID starting with 7c4a65b80e14e976071ef4edec11ff79428d416841654e1b8cc42e3a33e05db5 not found: ID does not exist" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.106036 4725 scope.go:117] "RemoveContainer" containerID="65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.108379 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 
06:34:51.111658 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03\": container with ID starting with 65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03 not found: ID does not exist" containerID="65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.111724 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03"} err="failed to get container status \"65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03\": rpc error: code = NotFound desc = could not find container \"65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03\": container with ID starting with 65bf1db1ada4f99cdd631e45fe3f8eeb47b37bc8470708f22ef57f4ee91dba03 not found: ID does not exist" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.111773 4725 scope.go:117] "RemoveContainer" containerID="d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.130803 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.139050 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.156473 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157155 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64cdb049-7b8a-4c79-8c77-678172f96778" containerName="nova-manage" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157176 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="64cdb049-7b8a-4c79-8c77-678172f96778" 
containerName="nova-manage" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157198 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerName="init" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157208 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerName="init" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157225 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-api" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157235 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-api" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157256 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-log" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157265 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-log" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157314 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerName="dnsmasq-dns" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157325 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerName="dnsmasq-dns" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157347 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-metadata" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157360 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-metadata" Feb 27 
06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.157385 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-log" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157397 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-log" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157685 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-log" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157712 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="64cdb049-7b8a-4c79-8c77-678172f96778" containerName="nova-manage" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157731 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4222491-f5d6-40be-832a-728651adf29f" containerName="nova-metadata-metadata" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157746 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="837a7d0c-6989-4582-9d05-8b5c73db83a2" containerName="dnsmasq-dns" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157768 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-log" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.157785 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" containerName="nova-api-api" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.159234 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.165039 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.165259 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.170584 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.172799 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.176986 4725 scope.go:117] "RemoveContainer" containerID="eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.187344 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.187647 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.187762 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.201002 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216437 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-config-data\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216515 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216568 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216595 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216745 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a660f84-32ef-4def-90b6-fd4a39e117dc-logs\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216828 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216852 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a9a58f39-222d-495a-9cde-272e31f1efae-logs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.216919 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-658v5\" (UniqueName: \"kubernetes.io/projected/a9a58f39-222d-495a-9cde-272e31f1efae-kube-api-access-658v5\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.217012 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2ht6\" (UniqueName: \"kubernetes.io/projected/7a660f84-32ef-4def-90b6-fd4a39e117dc-kube-api-access-h2ht6\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.217070 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.217134 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-config-data\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.220569 4725 scope.go:117] "RemoveContainer" containerID="d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.220661 4725 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.220930 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48\": container with ID starting with d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48 not found: ID does not exist" containerID="d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.220958 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48"} err="failed to get container status \"d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48\": rpc error: code = NotFound desc = could not find container \"d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48\": container with ID starting with d0f8936ffa828033ada7d5a15f337165f8d04ee72ba4079a045a117606511d48 not found: ID does not exist" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.220977 4725 scope.go:117] "RemoveContainer" containerID="eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac" Feb 27 06:34:51 crc kubenswrapper[4725]: E0227 06:34:51.221202 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac\": container with ID starting with eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac not found: ID does not exist" containerID="eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.221220 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac"} err="failed to get 
container status \"eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac\": rpc error: code = NotFound desc = could not find container \"eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac\": container with ID starting with eb39536da9c7f7c9ecbaaf3de3d3ad5080f797b0f472ff2ec6930d36743da4ac not found: ID does not exist" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318542 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2ht6\" (UniqueName: \"kubernetes.io/projected/7a660f84-32ef-4def-90b6-fd4a39e117dc-kube-api-access-h2ht6\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318585 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318623 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-config-data\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318686 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-config-data\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318714 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318743 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318764 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318858 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a660f84-32ef-4def-90b6-fd4a39e117dc-logs\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318895 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.318916 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9a58f39-222d-495a-9cde-272e31f1efae-logs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 
06:34:51.318949 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-658v5\" (UniqueName: \"kubernetes.io/projected/a9a58f39-222d-495a-9cde-272e31f1efae-kube-api-access-658v5\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.319316 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a660f84-32ef-4def-90b6-fd4a39e117dc-logs\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.319609 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9a58f39-222d-495a-9cde-272e31f1efae-logs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.321852 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.322178 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-config-data\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.322680 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.322854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.322961 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-config-data\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.323077 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a660f84-32ef-4def-90b6-fd4a39e117dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.325820 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a58f39-222d-495a-9cde-272e31f1efae-public-tls-certs\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.340608 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-658v5\" (UniqueName: \"kubernetes.io/projected/a9a58f39-222d-495a-9cde-272e31f1efae-kube-api-access-658v5\") pod \"nova-api-0\" (UID: \"a9a58f39-222d-495a-9cde-272e31f1efae\") " pod="openstack/nova-api-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.346865 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2ht6\" (UniqueName: 
\"kubernetes.io/projected/7a660f84-32ef-4def-90b6-fd4a39e117dc-kube-api-access-h2ht6\") pod \"nova-metadata-0\" (UID: \"7a660f84-32ef-4def-90b6-fd4a39e117dc\") " pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.489155 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 06:34:51 crc kubenswrapper[4725]: I0227 06:34:51.503532 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 06:34:52 crc kubenswrapper[4725]: I0227 06:34:52.037270 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 06:34:52 crc kubenswrapper[4725]: W0227 06:34:52.043396 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a660f84_32ef_4def_90b6_fd4a39e117dc.slice/crio-78c6f1a763ac8fd6b2bab449b19f672dd59b081b6ec73261ab08380eb4b88878 WatchSource:0}: Error finding container 78c6f1a763ac8fd6b2bab449b19f672dd59b081b6ec73261ab08380eb4b88878: Status 404 returned error can't find the container with id 78c6f1a763ac8fd6b2bab449b19f672dd59b081b6ec73261ab08380eb4b88878 Feb 27 06:34:52 crc kubenswrapper[4725]: I0227 06:34:52.051380 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 06:34:52 crc kubenswrapper[4725]: I0227 06:34:52.264265 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd2fa1a-0d1f-418b-9647-d4025d10c3a7" path="/var/lib/kubelet/pods/1bd2fa1a-0d1f-418b-9647-d4025d10c3a7/volumes" Feb 27 06:34:52 crc kubenswrapper[4725]: I0227 06:34:52.264883 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4222491-f5d6-40be-832a-728651adf29f" path="/var/lib/kubelet/pods/c4222491-f5d6-40be-832a-728651adf29f/volumes" Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.072838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"7a660f84-32ef-4def-90b6-fd4a39e117dc","Type":"ContainerStarted","Data":"50df5ec48e7486a629cf195f30f35278f7c87fd1899bfbce7fda4082cb616309"} Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.073130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a660f84-32ef-4def-90b6-fd4a39e117dc","Type":"ContainerStarted","Data":"d726e02eb64d4a8690c42d680834cb6a2a16c3ce2645da31822fc964559c31a0"} Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.073147 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a660f84-32ef-4def-90b6-fd4a39e117dc","Type":"ContainerStarted","Data":"78c6f1a763ac8fd6b2bab449b19f672dd59b081b6ec73261ab08380eb4b88878"} Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.076052 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9a58f39-222d-495a-9cde-272e31f1efae","Type":"ContainerStarted","Data":"e2dff5eee785e0e7f067e07504d52107b68e6bf0e9e65f7dfcad8a66f1c198b3"} Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.076117 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9a58f39-222d-495a-9cde-272e31f1efae","Type":"ContainerStarted","Data":"0c2decc649883413fc7346c50d430150835023210d15292b024a83fa5170127f"} Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.076137 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9a58f39-222d-495a-9cde-272e31f1efae","Type":"ContainerStarted","Data":"1a420a05eb581f1961c75fc02e96f2293b9f22900f327b920e73f842241562b8"} Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.107055 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.107006174 podStartE2EDuration="2.107006174s" podCreationTimestamp="2026-02-27 06:34:51 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:53.099192394 +0000 UTC m=+1471.561813043" watchObservedRunningTime="2026-02-27 06:34:53.107006174 +0000 UTC m=+1471.569626743" Feb 27 06:34:53 crc kubenswrapper[4725]: I0227 06:34:53.130991 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.130968409 podStartE2EDuration="2.130968409s" podCreationTimestamp="2026-02-27 06:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:53.12001412 +0000 UTC m=+1471.582634689" watchObservedRunningTime="2026-02-27 06:34:53.130968409 +0000 UTC m=+1471.593588988" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.661881 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.692956 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktlr5\" (UniqueName: \"kubernetes.io/projected/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-kube-api-access-ktlr5\") pod \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.693541 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-config-data\") pod \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\" (UID: \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.693576 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-combined-ca-bundle\") pod \"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\" (UID: 
\"c0a6e56c-1da8-4476-b8c9-f727c831c6c6\") " Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.707091 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-kube-api-access-ktlr5" (OuterVolumeSpecName: "kube-api-access-ktlr5") pod "c0a6e56c-1da8-4476-b8c9-f727c831c6c6" (UID: "c0a6e56c-1da8-4476-b8c9-f727c831c6c6"). InnerVolumeSpecName "kube-api-access-ktlr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.741368 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-config-data" (OuterVolumeSpecName: "config-data") pod "c0a6e56c-1da8-4476-b8c9-f727c831c6c6" (UID: "c0a6e56c-1da8-4476-b8c9-f727c831c6c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.766009 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0a6e56c-1da8-4476-b8c9-f727c831c6c6" (UID: "c0a6e56c-1da8-4476-b8c9-f727c831c6c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.795600 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktlr5\" (UniqueName: \"kubernetes.io/projected/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-kube-api-access-ktlr5\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.795629 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:54 crc kubenswrapper[4725]: I0227 06:34:54.795640 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a6e56c-1da8-4476-b8c9-f727c831c6c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.100143 4725 generic.go:334] "Generic (PLEG): container finished" podID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" exitCode=0 Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.100222 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a6e56c-1da8-4476-b8c9-f727c831c6c6","Type":"ContainerDied","Data":"42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce"} Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.100284 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a6e56c-1da8-4476-b8c9-f727c831c6c6","Type":"ContainerDied","Data":"f6494d3e38c77eda85279e0e38e012be5b0a33617671004af32d2a0004c1e245"} Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.100237 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.100314 4725 scope.go:117] "RemoveContainer" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.125883 4725 scope.go:117] "RemoveContainer" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" Feb 27 06:34:55 crc kubenswrapper[4725]: E0227 06:34:55.126365 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce\": container with ID starting with 42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce not found: ID does not exist" containerID="42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.126412 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce"} err="failed to get container status \"42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce\": rpc error: code = NotFound desc = could not find container \"42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce\": container with ID starting with 42b63fe6c3de78e4356d25d9977344754592fcf36e8841ed7482d7c7ec9a87ce not found: ID does not exist" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.167843 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.180341 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.210304 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:55 crc kubenswrapper[4725]: E0227 06:34:55.210783 4725 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" containerName="nova-scheduler-scheduler" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.210805 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" containerName="nova-scheduler-scheduler" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.211073 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" containerName="nova-scheduler-scheduler" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.212038 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.215107 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.224880 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.304175 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39eae0d-a597-445f-9134-7e2d9f5e82ff-config-data\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.304208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39eae0d-a597-445f-9134-7e2d9f5e82ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.304458 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jjmb4\" (UniqueName: \"kubernetes.io/projected/d39eae0d-a597-445f-9134-7e2d9f5e82ff-kube-api-access-jjmb4\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.406995 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmb4\" (UniqueName: \"kubernetes.io/projected/d39eae0d-a597-445f-9134-7e2d9f5e82ff-kube-api-access-jjmb4\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.407140 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39eae0d-a597-445f-9134-7e2d9f5e82ff-config-data\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.407181 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39eae0d-a597-445f-9134-7e2d9f5e82ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.412363 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39eae0d-a597-445f-9134-7e2d9f5e82ff-config-data\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.415025 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39eae0d-a597-445f-9134-7e2d9f5e82ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.435001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmb4\" (UniqueName: \"kubernetes.io/projected/d39eae0d-a597-445f-9134-7e2d9f5e82ff-kube-api-access-jjmb4\") pod \"nova-scheduler-0\" (UID: \"d39eae0d-a597-445f-9134-7e2d9f5e82ff\") " pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.580682 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 06:34:55 crc kubenswrapper[4725]: I0227 06:34:55.905171 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 06:34:55 crc kubenswrapper[4725]: W0227 06:34:55.912914 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd39eae0d_a597_445f_9134_7e2d9f5e82ff.slice/crio-312e1f667d0ef640b1f8525e098638156ed19a60de5b649a470bcedbe3d6dcb0 WatchSource:0}: Error finding container 312e1f667d0ef640b1f8525e098638156ed19a60de5b649a470bcedbe3d6dcb0: Status 404 returned error can't find the container with id 312e1f667d0ef640b1f8525e098638156ed19a60de5b649a470bcedbe3d6dcb0 Feb 27 06:34:56 crc kubenswrapper[4725]: I0227 06:34:56.122231 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d39eae0d-a597-445f-9134-7e2d9f5e82ff","Type":"ContainerStarted","Data":"312e1f667d0ef640b1f8525e098638156ed19a60de5b649a470bcedbe3d6dcb0"} Feb 27 06:34:56 crc kubenswrapper[4725]: I0227 06:34:56.269170 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a6e56c-1da8-4476-b8c9-f727c831c6c6" path="/var/lib/kubelet/pods/c0a6e56c-1da8-4476-b8c9-f727c831c6c6/volumes" Feb 27 06:34:56 crc kubenswrapper[4725]: I0227 06:34:56.490033 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Feb 27 06:34:56 crc kubenswrapper[4725]: I0227 06:34:56.490100 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 06:34:57 crc kubenswrapper[4725]: I0227 06:34:57.137526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d39eae0d-a597-445f-9134-7e2d9f5e82ff","Type":"ContainerStarted","Data":"e1554c8b61a5e52429d972e0bddf00d141680646f093ad75d0d7b539f30a53a9"} Feb 27 06:34:57 crc kubenswrapper[4725]: I0227 06:34:57.182587 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.182554873 podStartE2EDuration="2.182554873s" podCreationTimestamp="2026-02-27 06:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:34:57.178857639 +0000 UTC m=+1475.641478268" watchObservedRunningTime="2026-02-27 06:34:57.182554873 +0000 UTC m=+1475.645175472" Feb 27 06:35:00 crc kubenswrapper[4725]: I0227 06:35:00.026077 4725 scope.go:117] "RemoveContainer" containerID="f389f3767c97b06e8758ffb74af43d9d796a2c4122053af652e1aed939e58b65" Feb 27 06:35:00 crc kubenswrapper[4725]: I0227 06:35:00.581490 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 27 06:35:01 crc kubenswrapper[4725]: I0227 06:35:01.490204 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 06:35:01 crc kubenswrapper[4725]: I0227 06:35:01.490282 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 06:35:01 crc kubenswrapper[4725]: I0227 06:35:01.506068 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:35:01 crc kubenswrapper[4725]: I0227 06:35:01.506117 4725 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 06:35:02 crc kubenswrapper[4725]: I0227 06:35:02.514470 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7a660f84-32ef-4def-90b6-fd4a39e117dc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:35:02 crc kubenswrapper[4725]: I0227 06:35:02.514528 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7a660f84-32ef-4def-90b6-fd4a39e117dc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:35:02 crc kubenswrapper[4725]: I0227 06:35:02.530431 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9a58f39-222d-495a-9cde-272e31f1efae" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 06:35:02 crc kubenswrapper[4725]: I0227 06:35:02.530476 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9a58f39-222d-495a-9cde-272e31f1efae" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 06:35:05 crc kubenswrapper[4725]: I0227 06:35:05.581344 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 06:35:05 crc kubenswrapper[4725]: I0227 06:35:05.636473 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 06:35:06 crc kubenswrapper[4725]: I0227 06:35:06.318672 4725 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 06:35:06 crc kubenswrapper[4725]: I0227 06:35:06.503136 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.497903 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.501461 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.510652 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.523567 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.524384 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.527142 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 06:35:11 crc kubenswrapper[4725]: I0227 06:35:11.534834 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 06:35:12 crc kubenswrapper[4725]: I0227 06:35:12.359720 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 06:35:12 crc kubenswrapper[4725]: I0227 06:35:12.398956 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 06:35:12 crc kubenswrapper[4725]: I0227 06:35:12.413380 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 06:35:20 crc kubenswrapper[4725]: I0227 06:35:20.423581 4725 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:35:21 crc kubenswrapper[4725]: I0227 06:35:21.325435 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:35:23 crc kubenswrapper[4725]: I0227 06:35:23.534264 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="rabbitmq" containerID="cri-o://de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2" gracePeriod=604797 Feb 27 06:35:24 crc kubenswrapper[4725]: I0227 06:35:24.399553 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="rabbitmq" containerID="cri-o://52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7" gracePeriod=604797 Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.215571 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335302 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-config-data\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335362 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-server-conf\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335395 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335426 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-confd\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335503 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g2tv\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-kube-api-access-4g2tv\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335554 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-erlang-cookie\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335602 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb0a4e8-6f65-4961-9caa-18d66a6754af-pod-info\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335720 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-plugins-conf\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335770 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-plugins\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335815 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb0a4e8-6f65-4961-9caa-18d66a6754af-erlang-cookie-secret\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.335834 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-tls\") pod \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\" (UID: \"fdb0a4e8-6f65-4961-9caa-18d66a6754af\") " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 
06:35:25.342460 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.342835 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.345984 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.350598 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb0a4e8-6f65-4961-9caa-18d66a6754af-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.350614 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fdb0a4e8-6f65-4961-9caa-18d66a6754af-pod-info" (OuterVolumeSpecName: "pod-info") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.359506 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-kube-api-access-4g2tv" (OuterVolumeSpecName: "kube-api-access-4g2tv") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "kube-api-access-4g2tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.366138 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.377537 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.417811 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-config-data" (OuterVolumeSpecName: "config-data") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.432469 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-server-conf" (OuterVolumeSpecName: "server-conf") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438859 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438887 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g2tv\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-kube-api-access-4g2tv\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438899 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438908 4725 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb0a4e8-6f65-4961-9caa-18d66a6754af-pod-info\") on node \"crc\" DevicePath \"\"" 
Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438917 4725 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438927 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438937 4725 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb0a4e8-6f65-4961-9caa-18d66a6754af-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438945 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438953 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.438961 4725 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb0a4e8-6f65-4961-9caa-18d66a6754af-server-conf\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.472904 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.523470 4725 generic.go:334] "Generic (PLEG): container finished" podID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" 
containerID="de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2" exitCode=0 Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.523514 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb0a4e8-6f65-4961-9caa-18d66a6754af","Type":"ContainerDied","Data":"de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2"} Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.523541 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb0a4e8-6f65-4961-9caa-18d66a6754af","Type":"ContainerDied","Data":"bc4b52f52ec537f563cacac677b069f6b43b675f9dc8e0b241c9d8b0349946e5"} Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.523556 4725 scope.go:117] "RemoveContainer" containerID="de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.523584 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.537454 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fdb0a4e8-6f65-4961-9caa-18d66a6754af" (UID: "fdb0a4e8-6f65-4961-9caa-18d66a6754af"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.541470 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb0a4e8-6f65-4961-9caa-18d66a6754af-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.541496 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.583394 4725 scope.go:117] "RemoveContainer" containerID="49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.698372 4725 scope.go:117] "RemoveContainer" containerID="de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2" Feb 27 06:35:25 crc kubenswrapper[4725]: E0227 06:35:25.698849 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2\": container with ID starting with de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2 not found: ID does not exist" containerID="de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.698879 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2"} err="failed to get container status \"de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2\": rpc error: code = NotFound desc = could not find container \"de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2\": container with ID starting with de639d2378da5a598b03bdfd873d91c3acdfb848737353cedecfbbcd3b2ce4a2 not found: ID does not exist" Feb 27 06:35:25 
crc kubenswrapper[4725]: I0227 06:35:25.698901 4725 scope.go:117] "RemoveContainer" containerID="49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2" Feb 27 06:35:25 crc kubenswrapper[4725]: E0227 06:35:25.699265 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2\": container with ID starting with 49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2 not found: ID does not exist" containerID="49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.699325 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2"} err="failed to get container status \"49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2\": rpc error: code = NotFound desc = could not find container \"49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2\": container with ID starting with 49676da74a24ddd8314c99da652b65be7e315e13e302f1693cd74efdd8e273e2 not found: ID does not exist" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.872975 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.884363 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.924363 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:35:25 crc kubenswrapper[4725]: E0227 06:35:25.926056 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="rabbitmq" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.926083 4725 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="rabbitmq" Feb 27 06:35:25 crc kubenswrapper[4725]: E0227 06:35:25.926155 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="setup-container" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.926165 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="setup-container" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.926726 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" containerName="rabbitmq" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.935471 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.939385 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.941492 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.941676 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.941760 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.941799 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.941939 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.942087 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 
06:35:25 crc kubenswrapper[4725]: I0227 06:35:25.942206 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jkwst" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.109038 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113075 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5czpx\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-kube-api-access-5czpx\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113124 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113159 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4112b6c-11e8-4244-9a39-c7474ffd192b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113364 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113548 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113592 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4112b6c-11e8-4244-9a39-c7474ffd192b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113608 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113644 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113661 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113747 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.113895 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215492 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-config-data\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215611 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ac67077-5fb4-4890-98ba-f5280a08e464-erlang-cookie-secret\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215648 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-plugins\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215688 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdd8v\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-kube-api-access-rdd8v\") pod 
\"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215708 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-plugins-conf\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215730 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-server-conf\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.215792 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.216389 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.216417 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-tls\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.216445 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ac67077-5fb4-4890-98ba-f5280a08e464-pod-info\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.216790 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.218408 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-erlang-cookie\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.218537 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-confd\") pod \"5ac67077-5fb4-4890-98ba-f5280a08e464\" (UID: \"5ac67077-5fb4-4890-98ba-f5280a08e464\") " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.218854 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219017 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219122 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5czpx\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-kube-api-access-5czpx\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219172 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219229 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4112b6c-11e8-4244-9a39-c7474ffd192b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219384 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219514 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.219759 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-kube-api-access-rdd8v" (OuterVolumeSpecName: "kube-api-access-rdd8v") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "kube-api-access-rdd8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.220700 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.220816 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.220859 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4112b6c-11e8-4244-9a39-c7474ffd192b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.220879 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.220912 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.220934 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221018 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221187 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221200 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221209 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdd8v\" 
(UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-kube-api-access-rdd8v\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221218 4725 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221257 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.221731 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.222170 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.222316 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac67077-5fb4-4890-98ba-f5280a08e464-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.224134 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4112b6c-11e8-4244-9a39-c7474ffd192b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.224232 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.225566 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.226253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.226377 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5ac67077-5fb4-4890-98ba-f5280a08e464-pod-info" (OuterVolumeSpecName: "pod-info") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.228382 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.229893 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4112b6c-11e8-4244-9a39-c7474ffd192b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.237597 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5czpx\" (UniqueName: \"kubernetes.io/projected/e4112b6c-11e8-4244-9a39-c7474ffd192b-kube-api-access-5czpx\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.246888 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4112b6c-11e8-4244-9a39-c7474ffd192b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.269707 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb0a4e8-6f65-4961-9caa-18d66a6754af" path="/var/lib/kubelet/pods/fdb0a4e8-6f65-4961-9caa-18d66a6754af/volumes" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.273102 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"rabbitmq-server-0\" (UID: \"e4112b6c-11e8-4244-9a39-c7474ffd192b\") " pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.275536 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-config-data" (OuterVolumeSpecName: "config-data") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.292967 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.313216 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-server-conf" (OuterVolumeSpecName: "server-conf") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.322806 4725 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-server-conf\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.322850 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.322860 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.322871 4725 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5ac67077-5fb4-4890-98ba-f5280a08e464-pod-info\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.322880 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ac67077-5fb4-4890-98ba-f5280a08e464-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.322888 4725 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5ac67077-5fb4-4890-98ba-f5280a08e464-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.347183 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.394891 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5ac67077-5fb4-4890-98ba-f5280a08e464" (UID: "5ac67077-5fb4-4890-98ba-f5280a08e464"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.426242 4725 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5ac67077-5fb4-4890-98ba-f5280a08e464-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.426273 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.538870 4725 generic.go:334] "Generic (PLEG): container finished" podID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerID="52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7" exitCode=0 Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.538923 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ac67077-5fb4-4890-98ba-f5280a08e464","Type":"ContainerDied","Data":"52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7"} Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.538955 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.538981 4725 scope.go:117] "RemoveContainer" containerID="52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.538964 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5ac67077-5fb4-4890-98ba-f5280a08e464","Type":"ContainerDied","Data":"59ae8d274721137a61e6ac4d02c469f52953a315578ae1694d6453e61e687ce7"} Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.581349 4725 scope.go:117] "RemoveContainer" containerID="4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.591197 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.616093 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.617986 4725 scope.go:117] "RemoveContainer" containerID="52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7" Feb 27 06:35:26 crc kubenswrapper[4725]: E0227 06:35:26.620319 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7\": container with ID starting with 52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7 not found: ID does not exist" containerID="52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.620355 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7"} err="failed to get container status 
\"52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7\": rpc error: code = NotFound desc = could not find container \"52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7\": container with ID starting with 52ccc3dd0437a12a21e0de8e45725da653f910f03fcff1c5b9e909fc3d7840a7 not found: ID does not exist" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.620376 4725 scope.go:117] "RemoveContainer" containerID="4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b" Feb 27 06:35:26 crc kubenswrapper[4725]: E0227 06:35:26.623527 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b\": container with ID starting with 4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b not found: ID does not exist" containerID="4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.623583 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b"} err="failed to get container status \"4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b\": rpc error: code = NotFound desc = could not find container \"4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b\": container with ID starting with 4552d215563093cbf16673b811fe35fee0f470102af57184d3c227d5d1bf5b2b not found: ID does not exist" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.623643 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:35:26 crc kubenswrapper[4725]: E0227 06:35:26.624152 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="setup-container" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.624179 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="setup-container" Feb 27 06:35:26 crc kubenswrapper[4725]: E0227 06:35:26.624220 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="rabbitmq" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.624228 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="rabbitmq" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.624532 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" containerName="rabbitmq" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.625954 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.628913 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.628967 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.629018 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.629102 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-g8rzr" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.629311 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.632627 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.633518 4725 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.634197 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732087 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732162 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89e15a0f-61a2-4114-b1cc-385f54f886d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732396 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732593 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8lh\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-kube-api-access-wg8lh\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732675 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732761 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732814 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732866 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89e15a0f-61a2-4114-b1cc-385f54f886d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.732895 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.733003 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.733086 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: W0227 06:35:26.746818 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4112b6c_11e8_4244_9a39_c7474ffd192b.slice/crio-e642e6b5a1354a039eb5a730248e09f337a2718870c445edd132ea84bd82dcce WatchSource:0}: Error finding container e642e6b5a1354a039eb5a730248e09f337a2718870c445edd132ea84bd82dcce: Status 404 returned error can't find the container with id e642e6b5a1354a039eb5a730248e09f337a2718870c445edd132ea84bd82dcce Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.748806 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.834974 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8lh\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-kube-api-access-wg8lh\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835058 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835109 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835139 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835172 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89e15a0f-61a2-4114-b1cc-385f54f886d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835196 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835254 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835280 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835388 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835492 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89e15a0f-61a2-4114-b1cc-385f54f886d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835548 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.835805 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc 
kubenswrapper[4725]: I0227 06:35:26.836357 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.836457 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.836522 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.837253 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.838645 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89e15a0f-61a2-4114-b1cc-385f54f886d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.840095 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.841401 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89e15a0f-61a2-4114-b1cc-385f54f886d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.841819 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89e15a0f-61a2-4114-b1cc-385f54f886d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.846459 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.854982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8lh\" (UniqueName: \"kubernetes.io/projected/89e15a0f-61a2-4114-b1cc-385f54f886d3-kube-api-access-wg8lh\") pod \"rabbitmq-cell1-server-0\" (UID: \"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.886965 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"89e15a0f-61a2-4114-b1cc-385f54f886d3\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:26 crc kubenswrapper[4725]: I0227 06:35:26.954817 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:35:27 crc kubenswrapper[4725]: I0227 06:35:27.469385 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 06:35:27 crc kubenswrapper[4725]: W0227 06:35:27.476712 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e15a0f_61a2_4114_b1cc_385f54f886d3.slice/crio-01245d73bdede0dcfb3d0f445047cf8a4db80b01034f141816022f3c24a6a239 WatchSource:0}: Error finding container 01245d73bdede0dcfb3d0f445047cf8a4db80b01034f141816022f3c24a6a239: Status 404 returned error can't find the container with id 01245d73bdede0dcfb3d0f445047cf8a4db80b01034f141816022f3c24a6a239 Feb 27 06:35:27 crc kubenswrapper[4725]: I0227 06:35:27.549269 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89e15a0f-61a2-4114-b1cc-385f54f886d3","Type":"ContainerStarted","Data":"01245d73bdede0dcfb3d0f445047cf8a4db80b01034f141816022f3c24a6a239"} Feb 27 06:35:27 crc kubenswrapper[4725]: I0227 06:35:27.550826 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e4112b6c-11e8-4244-9a39-c7474ffd192b","Type":"ContainerStarted","Data":"e642e6b5a1354a039eb5a730248e09f337a2718870c445edd132ea84bd82dcce"} Feb 27 06:35:28 crc kubenswrapper[4725]: I0227 06:35:28.262042 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac67077-5fb4-4890-98ba-f5280a08e464" path="/var/lib/kubelet/pods/5ac67077-5fb4-4890-98ba-f5280a08e464/volumes" Feb 27 06:35:28 crc kubenswrapper[4725]: I0227 06:35:28.566780 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e4112b6c-11e8-4244-9a39-c7474ffd192b","Type":"ContainerStarted","Data":"866fa46fd5598cdcbe333d94d521944105d986849bc7b60d0e05fd41bd9eae54"} Feb 27 06:35:29 crc kubenswrapper[4725]: I0227 06:35:29.586335 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89e15a0f-61a2-4114-b1cc-385f54f886d3","Type":"ContainerStarted","Data":"701a71632ecf7f58d49e9162e1cd85afed056b79218796bab3e2ed7c45ea2781"} Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.187008 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kznrh"] Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.190388 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.201014 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kznrh"] Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.337597 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-catalog-content\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.337637 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bdfx\" (UniqueName: \"kubernetes.io/projected/c7d62381-257d-431b-96a6-f1c46dc216f0-kube-api-access-4bdfx\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.337743 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-utilities\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.439277 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-utilities\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.439440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-catalog-content\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.439462 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bdfx\" (UniqueName: \"kubernetes.io/projected/c7d62381-257d-431b-96a6-f1c46dc216f0-kube-api-access-4bdfx\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.439981 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-catalog-content\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.440249 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-utilities\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.477712 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bdfx\" (UniqueName: \"kubernetes.io/projected/c7d62381-257d-431b-96a6-f1c46dc216f0-kube-api-access-4bdfx\") pod \"redhat-operators-kznrh\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:31 crc kubenswrapper[4725]: I0227 06:35:31.525855 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.026132 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kznrh"]
Feb 27 06:35:32 crc kubenswrapper[4725]: W0227 06:35:32.027977 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d62381_257d_431b_96a6_f1c46dc216f0.slice/crio-f713b6b3a54f9e13724347c6e936ccb17f7d3fa6667704af006789552d3a44cd WatchSource:0}: Error finding container f713b6b3a54f9e13724347c6e936ccb17f7d3fa6667704af006789552d3a44cd: Status 404 returned error can't find the container with id f713b6b3a54f9e13724347c6e936ccb17f7d3fa6667704af006789552d3a44cd
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.554491 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.554841 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.617902 4725 generic.go:334] "Generic (PLEG): container finished" podID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerID="b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41" exitCode=0
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.617956 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerDied","Data":"b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41"}
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.617986 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerStarted","Data":"f713b6b3a54f9e13724347c6e936ccb17f7d3fa6667704af006789552d3a44cd"}
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.620097 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.972707 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd587cbc5-fh7w4"]
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.975754 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.977896 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Feb 27 06:35:32 crc kubenswrapper[4725]: I0227 06:35:32.991772 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd587cbc5-fh7w4"]
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.068986 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkt6s\" (UniqueName: \"kubernetes.io/projected/4464d14b-8613-42c0-8e2f-5a473b61604b-kube-api-access-pkt6s\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.069079 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-config\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.069142 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.069237 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.069360 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-svc\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.069496 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.069530 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171373 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171422 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171490 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkt6s\" (UniqueName: \"kubernetes.io/projected/4464d14b-8613-42c0-8e2f-5a473b61604b-kube-api-access-pkt6s\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171580 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-config\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171679 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171722 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.171759 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-svc\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.174782 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.174814 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.174953 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-config\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.176277 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.176700 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.182247 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-svc\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.197662 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkt6s\" (UniqueName: \"kubernetes.io/projected/4464d14b-8613-42c0-8e2f-5a473b61604b-kube-api-access-pkt6s\") pod \"dnsmasq-dns-7fd587cbc5-fh7w4\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.291907 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.629928 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerStarted","Data":"a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75"}
Feb 27 06:35:33 crc kubenswrapper[4725]: I0227 06:35:33.787598 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd587cbc5-fh7w4"]
Feb 27 06:35:34 crc kubenswrapper[4725]: I0227 06:35:34.641950 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" event={"ID":"4464d14b-8613-42c0-8e2f-5a473b61604b","Type":"ContainerStarted","Data":"eebc16d64120acbb0ba89e85d1fff82b9c11f20d4c129f7aea0f50f0d3720d33"}
Feb 27 06:35:34 crc kubenswrapper[4725]: I0227 06:35:34.642320 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" event={"ID":"4464d14b-8613-42c0-8e2f-5a473b61604b","Type":"ContainerStarted","Data":"36633e992afb2840b564e9ca2d562d4026d66d49296391a5046182994c2447f4"}
Feb 27 06:35:35 crc kubenswrapper[4725]: I0227 06:35:35.654277 4725 generic.go:334] "Generic (PLEG): container finished" podID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerID="a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75" exitCode=0
Feb 27 06:35:35 crc kubenswrapper[4725]: I0227 06:35:35.654441 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerDied","Data":"a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75"}
Feb 27 06:35:35 crc kubenswrapper[4725]: I0227 06:35:35.656404 4725 generic.go:334] "Generic (PLEG): container finished" podID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerID="eebc16d64120acbb0ba89e85d1fff82b9c11f20d4c129f7aea0f50f0d3720d33" exitCode=0
Feb 27 06:35:35 crc kubenswrapper[4725]: I0227 06:35:35.656433 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" event={"ID":"4464d14b-8613-42c0-8e2f-5a473b61604b","Type":"ContainerDied","Data":"eebc16d64120acbb0ba89e85d1fff82b9c11f20d4c129f7aea0f50f0d3720d33"}
Feb 27 06:35:36 crc kubenswrapper[4725]: I0227 06:35:36.672935 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" event={"ID":"4464d14b-8613-42c0-8e2f-5a473b61604b","Type":"ContainerStarted","Data":"070efbb24c385c46d32dec37f8932cef76a0fd091325ed2807f8e95b2ee17b07"}
Feb 27 06:35:36 crc kubenswrapper[4725]: I0227 06:35:36.673303 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:36 crc kubenswrapper[4725]: I0227 06:35:36.703755 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" podStartSLOduration=4.703734853 podStartE2EDuration="4.703734853s" podCreationTimestamp="2026-02-27 06:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:35:36.692638721 +0000 UTC m=+1515.155259340" watchObservedRunningTime="2026-02-27 06:35:36.703734853 +0000 UTC m=+1515.166355432"
Feb 27 06:35:37 crc kubenswrapper[4725]: I0227 06:35:37.687561 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerStarted","Data":"18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665"}
Feb 27 06:35:37 crc kubenswrapper[4725]: I0227 06:35:37.726845 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kznrh" podStartSLOduration=2.495426424 podStartE2EDuration="6.726822983s" podCreationTimestamp="2026-02-27 06:35:31 +0000 UTC" firstStartedPulling="2026-02-27 06:35:32.61986533 +0000 UTC m=+1511.082485899" lastFinishedPulling="2026-02-27 06:35:36.851261879 +0000 UTC m=+1515.313882458" observedRunningTime="2026-02-27 06:35:37.715740631 +0000 UTC m=+1516.178361210" watchObservedRunningTime="2026-02-27 06:35:37.726822983 +0000 UTC m=+1516.189443552"
Feb 27 06:35:41 crc kubenswrapper[4725]: I0227 06:35:41.526338 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:41 crc kubenswrapper[4725]: I0227 06:35:41.527182 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:42 crc kubenswrapper[4725]: I0227 06:35:42.576215 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kznrh" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="registry-server" probeResult="failure" output=<
Feb 27 06:35:42 crc kubenswrapper[4725]: 	timeout: failed to connect service ":50051" within 1s
Feb 27 06:35:42 crc kubenswrapper[4725]: >
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.293508 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.384203 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59749476c-fklwv"]
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.384541 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59749476c-fklwv" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerName="dnsmasq-dns" containerID="cri-o://8595668007a4d363f6370965bf76d4cdb0329ca854fa09cb5feb75235fd4c35a" gracePeriod=10
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.629175 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-584644fbc5-9wt8c"]
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.640296 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.682350 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-584644fbc5-9wt8c"]
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.703942 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-ovsdbserver-sb\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.704045 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zhg\" (UniqueName: \"kubernetes.io/projected/77a07f13-4e0b-4d51-9e35-2787348e7a63-kube-api-access-w5zhg\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.704068 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-ovsdbserver-nb\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.704097 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-dns-swift-storage-0\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.704126 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-config\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.704141 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-openstack-edpm-ipam\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.704166 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-dns-svc\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.787231 4725 generic.go:334] "Generic (PLEG): container finished" podID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerID="8595668007a4d363f6370965bf76d4cdb0329ca854fa09cb5feb75235fd4c35a" exitCode=0
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.787272 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59749476c-fklwv" event={"ID":"c012782a-1d54-4605-9d08-4ffacc6dc1a1","Type":"ContainerDied","Data":"8595668007a4d363f6370965bf76d4cdb0329ca854fa09cb5feb75235fd4c35a"}
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.807998 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zhg\" (UniqueName: \"kubernetes.io/projected/77a07f13-4e0b-4d51-9e35-2787348e7a63-kube-api-access-w5zhg\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.808064 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-ovsdbserver-nb\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.808109 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-dns-swift-storage-0\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.808152 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-openstack-edpm-ipam\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.808174 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-config\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.808211 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-dns-svc\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.808269 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-ovsdbserver-sb\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.809342 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-ovsdbserver-sb\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.811039 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-ovsdbserver-nb\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.811207 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-openstack-edpm-ipam\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.811975 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-dns-swift-storage-0\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.812184 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-dns-svc\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.812197 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a07f13-4e0b-4d51-9e35-2787348e7a63-config\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.840697 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zhg\" (UniqueName: \"kubernetes.io/projected/77a07f13-4e0b-4d51-9e35-2787348e7a63-kube-api-access-w5zhg\") pod \"dnsmasq-dns-584644fbc5-9wt8c\" (UID: \"77a07f13-4e0b-4d51-9e35-2787348e7a63\") " pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:43 crc kubenswrapper[4725]: I0227 06:35:43.968918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.102188 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59749476c-fklwv"
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.121596 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-swift-storage-0\") pod \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") "
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.121682 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-config\") pod \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") "
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.121739 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-svc\") pod \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") "
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.121765 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4s6r\" (UniqueName: \"kubernetes.io/projected/c012782a-1d54-4605-9d08-4ffacc6dc1a1-kube-api-access-x4s6r\") pod \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") "
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.121831 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-sb\") pod \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") "
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.121999 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-nb\") pod \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\" (UID: \"c012782a-1d54-4605-9d08-4ffacc6dc1a1\") "
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.128034 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c012782a-1d54-4605-9d08-4ffacc6dc1a1-kube-api-access-x4s6r" (OuterVolumeSpecName: "kube-api-access-x4s6r") pod "c012782a-1d54-4605-9d08-4ffacc6dc1a1" (UID: "c012782a-1d54-4605-9d08-4ffacc6dc1a1"). InnerVolumeSpecName "kube-api-access-x4s6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.182122 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c012782a-1d54-4605-9d08-4ffacc6dc1a1" (UID: "c012782a-1d54-4605-9d08-4ffacc6dc1a1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.191650 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c012782a-1d54-4605-9d08-4ffacc6dc1a1" (UID: "c012782a-1d54-4605-9d08-4ffacc6dc1a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.207721 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c012782a-1d54-4605-9d08-4ffacc6dc1a1" (UID: "c012782a-1d54-4605-9d08-4ffacc6dc1a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.219259 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c012782a-1d54-4605-9d08-4ffacc6dc1a1" (UID: "c012782a-1d54-4605-9d08-4ffacc6dc1a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.224663 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.224708 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.224721 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4s6r\" (UniqueName: \"kubernetes.io/projected/c012782a-1d54-4605-9d08-4ffacc6dc1a1-kube-api-access-x4s6r\") on node \"crc\" DevicePath \"\""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.224789 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.224803 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.239169 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-config" (OuterVolumeSpecName: "config") pod "c012782a-1d54-4605-9d08-4ffacc6dc1a1" (UID: "c012782a-1d54-4605-9d08-4ffacc6dc1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.327519 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c012782a-1d54-4605-9d08-4ffacc6dc1a1-config\") on node \"crc\" DevicePath \"\""
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.484647 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-584644fbc5-9wt8c"]
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.797733 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59749476c-fklwv"
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.797736 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59749476c-fklwv" event={"ID":"c012782a-1d54-4605-9d08-4ffacc6dc1a1","Type":"ContainerDied","Data":"df041be80598e27404afb6b1c66ad838d9051c661fdb37109d4030d31b1b6ecf"}
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.798080 4725 scope.go:117] "RemoveContainer" containerID="8595668007a4d363f6370965bf76d4cdb0329ca854fa09cb5feb75235fd4c35a"
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.798827 4725 generic.go:334] "Generic (PLEG): container finished" podID="77a07f13-4e0b-4d51-9e35-2787348e7a63" containerID="4b26fc8d43e4370e650712d357a0e1957ab333aa1cf0e77d9458a80ed61e1bb1" exitCode=0
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.798852 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c" event={"ID":"77a07f13-4e0b-4d51-9e35-2787348e7a63","Type":"ContainerDied","Data":"4b26fc8d43e4370e650712d357a0e1957ab333aa1cf0e77d9458a80ed61e1bb1"}
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.798866 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c" event={"ID":"77a07f13-4e0b-4d51-9e35-2787348e7a63","Type":"ContainerStarted","Data":"198dd95c981f0ece3bfa135d470a5a57de7ce4a45d7aa30b7cbfe4e812dfc4a3"}
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.828105 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59749476c-fklwv"]
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.843605 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59749476c-fklwv"]
Feb 27 06:35:44 crc kubenswrapper[4725]: I0227 06:35:44.926278 4725 scope.go:117] "RemoveContainer" containerID="d3a618062f19f9c10ae587093c2976f04668bc73273b71bb5784fcf3686f4a81"
Feb 27 06:35:45 crc kubenswrapper[4725]: I0227 06:35:45.808951 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c" event={"ID":"77a07f13-4e0b-4d51-9e35-2787348e7a63","Type":"ContainerStarted","Data":"412677a61211441ff0590bc6de2c964b3ad2cf10ceab70d1e231131b7f62b121"}
Feb 27 06:35:45 crc kubenswrapper[4725]: I0227 06:35:45.809487 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c"
Feb 27 06:35:45 crc kubenswrapper[4725]: I0227 06:35:45.831538 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c" podStartSLOduration=2.831519504 podStartE2EDuration="2.831519504s" podCreationTimestamp="2026-02-27 06:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:35:45.825321649 +0000 UTC m=+1524.287942238" watchObservedRunningTime="2026-02-27 06:35:45.831519504 +0000 UTC m=+1524.294140073"
Feb 27 06:35:46 crc kubenswrapper[4725]: I0227 06:35:46.268360 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" path="/var/lib/kubelet/pods/c012782a-1d54-4605-9d08-4ffacc6dc1a1/volumes"
Feb 27 06:35:51 crc kubenswrapper[4725]: I0227 06:35:51.585456 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:51 crc kubenswrapper[4725]: I0227 06:35:51.661140 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kznrh"
Feb 27 06:35:51 crc kubenswrapper[4725]: I0227 06:35:51.842769 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kznrh"]
Feb 27 06:35:52 crc kubenswrapper[4725]: I0227 06:35:52.895159 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kznrh" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="registry-server" containerID="cri-o://18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665" gracePeriod=2
Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.385525 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.551789 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-utilities\") pod \"c7d62381-257d-431b-96a6-f1c46dc216f0\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.551862 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-catalog-content\") pod \"c7d62381-257d-431b-96a6-f1c46dc216f0\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.552023 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bdfx\" (UniqueName: \"kubernetes.io/projected/c7d62381-257d-431b-96a6-f1c46dc216f0-kube-api-access-4bdfx\") pod \"c7d62381-257d-431b-96a6-f1c46dc216f0\" (UID: \"c7d62381-257d-431b-96a6-f1c46dc216f0\") " Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.552870 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-utilities" (OuterVolumeSpecName: "utilities") pod "c7d62381-257d-431b-96a6-f1c46dc216f0" (UID: "c7d62381-257d-431b-96a6-f1c46dc216f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.560315 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d62381-257d-431b-96a6-f1c46dc216f0-kube-api-access-4bdfx" (OuterVolumeSpecName: "kube-api-access-4bdfx") pod "c7d62381-257d-431b-96a6-f1c46dc216f0" (UID: "c7d62381-257d-431b-96a6-f1c46dc216f0"). InnerVolumeSpecName "kube-api-access-4bdfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.655699 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.655752 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bdfx\" (UniqueName: \"kubernetes.io/projected/c7d62381-257d-431b-96a6-f1c46dc216f0-kube-api-access-4bdfx\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.682938 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7d62381-257d-431b-96a6-f1c46dc216f0" (UID: "c7d62381-257d-431b-96a6-f1c46dc216f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.757758 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d62381-257d-431b-96a6-f1c46dc216f0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.909071 4725 generic.go:334] "Generic (PLEG): container finished" podID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerID="18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665" exitCode=0 Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.909158 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerDied","Data":"18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665"} Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.909193 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kznrh" event={"ID":"c7d62381-257d-431b-96a6-f1c46dc216f0","Type":"ContainerDied","Data":"f713b6b3a54f9e13724347c6e936ccb17f7d3fa6667704af006789552d3a44cd"} Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.909215 4725 scope.go:117] "RemoveContainer" containerID="18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.909419 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kznrh" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.937662 4725 scope.go:117] "RemoveContainer" containerID="a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.963430 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kznrh"] Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.971030 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-584644fbc5-9wt8c" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.971202 4725 scope.go:117] "RemoveContainer" containerID="b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41" Feb 27 06:35:53 crc kubenswrapper[4725]: I0227 06:35:53.979920 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kznrh"] Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.055759 4725 scope.go:117] "RemoveContainer" containerID="18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665" Feb 27 06:35:54 crc kubenswrapper[4725]: E0227 06:35:54.056664 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665\": container with ID starting with 18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665 not 
found: ID does not exist" containerID="18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.056699 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665"} err="failed to get container status \"18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665\": rpc error: code = NotFound desc = could not find container \"18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665\": container with ID starting with 18538de1ad861bd775e0374f403f4942dd03de13bd125008d0bb3717ff065665 not found: ID does not exist" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.056727 4725 scope.go:117] "RemoveContainer" containerID="a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.059719 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd587cbc5-fh7w4"] Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.059937 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerName="dnsmasq-dns" containerID="cri-o://070efbb24c385c46d32dec37f8932cef76a0fd091325ed2807f8e95b2ee17b07" gracePeriod=10 Feb 27 06:35:54 crc kubenswrapper[4725]: E0227 06:35:54.065115 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75\": container with ID starting with a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75 not found: ID does not exist" containerID="a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.065164 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75"} err="failed to get container status \"a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75\": rpc error: code = NotFound desc = could not find container \"a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75\": container with ID starting with a0b7acb7df6da97663d806c2086f12249f747b062e441379b3fa85b1e085ea75 not found: ID does not exist" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.065195 4725 scope.go:117] "RemoveContainer" containerID="b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41" Feb 27 06:35:54 crc kubenswrapper[4725]: E0227 06:35:54.072085 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41\": container with ID starting with b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41 not found: ID does not exist" containerID="b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.072126 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41"} err="failed to get container status \"b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41\": rpc error: code = NotFound desc = could not find container \"b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41\": container with ID starting with b34ac045df1d6bcb0755f099a4777740dfb354e39ed4c4d2f478f296de38cb41 not found: ID does not exist" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.266855 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" path="/var/lib/kubelet/pods/c7d62381-257d-431b-96a6-f1c46dc216f0/volumes" Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 
06:35:54.925102 4725 generic.go:334] "Generic (PLEG): container finished" podID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerID="070efbb24c385c46d32dec37f8932cef76a0fd091325ed2807f8e95b2ee17b07" exitCode=0 Feb 27 06:35:54 crc kubenswrapper[4725]: I0227 06:35:54.925561 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" event={"ID":"4464d14b-8613-42c0-8e2f-5a473b61604b","Type":"ContainerDied","Data":"070efbb24c385c46d32dec37f8932cef76a0fd091325ed2807f8e95b2ee17b07"} Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.039326 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.198261 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkt6s\" (UniqueName: \"kubernetes.io/projected/4464d14b-8613-42c0-8e2f-5a473b61604b-kube-api-access-pkt6s\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.198378 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-swift-storage-0\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.198657 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-nb\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.198723 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-openstack-edpm-ipam\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.199382 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-config\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.199451 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-sb\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.199509 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-svc\") pod \"4464d14b-8613-42c0-8e2f-5a473b61604b\" (UID: \"4464d14b-8613-42c0-8e2f-5a473b61604b\") " Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.204941 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4464d14b-8613-42c0-8e2f-5a473b61604b-kube-api-access-pkt6s" (OuterVolumeSpecName: "kube-api-access-pkt6s") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "kube-api-access-pkt6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.253917 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.266828 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.271739 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.275456 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.280323 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.291467 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-config" (OuterVolumeSpecName: "config") pod "4464d14b-8613-42c0-8e2f-5a473b61604b" (UID: "4464d14b-8613-42c0-8e2f-5a473b61604b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.301910 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkt6s\" (UniqueName: \"kubernetes.io/projected/4464d14b-8613-42c0-8e2f-5a473b61604b-kube-api-access-pkt6s\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.302032 4725 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.302043 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.302051 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.302061 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.302070 4725 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.302087 4725 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4464d14b-8613-42c0-8e2f-5a473b61604b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.939694 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" event={"ID":"4464d14b-8613-42c0-8e2f-5a473b61604b","Type":"ContainerDied","Data":"36633e992afb2840b564e9ca2d562d4026d66d49296391a5046182994c2447f4"} Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.940018 4725 scope.go:117] "RemoveContainer" containerID="070efbb24c385c46d32dec37f8932cef76a0fd091325ed2807f8e95b2ee17b07" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.939743 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd587cbc5-fh7w4" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.962782 4725 scope.go:117] "RemoveContainer" containerID="eebc16d64120acbb0ba89e85d1fff82b9c11f20d4c129f7aea0f50f0d3720d33" Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.974233 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd587cbc5-fh7w4"] Feb 27 06:35:55 crc kubenswrapper[4725]: I0227 06:35:55.983346 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd587cbc5-fh7w4"] Feb 27 06:35:56 crc kubenswrapper[4725]: I0227 06:35:56.271681 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" path="/var/lib/kubelet/pods/4464d14b-8613-42c0-8e2f-5a473b61604b/volumes" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.140360 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536236-6c6mp"] Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.141963 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerName="dnsmasq-dns" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.141994 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerName="dnsmasq-dns" Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.142020 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerName="init" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142033 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerName="init" Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.142065 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="registry-server" Feb 27 06:36:00 crc 
kubenswrapper[4725]: I0227 06:36:00.142079 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="registry-server" Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.142100 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="extract-content" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142114 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="extract-content" Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.142148 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="extract-utilities" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142160 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="extract-utilities" Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.142197 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerName="dnsmasq-dns" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142208 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerName="dnsmasq-dns" Feb 27 06:36:00 crc kubenswrapper[4725]: E0227 06:36:00.142224 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerName="init" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142235 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerName="init" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142643 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d62381-257d-431b-96a6-f1c46dc216f0" containerName="registry-server" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142699 
4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4464d14b-8613-42c0-8e2f-5a473b61604b" containerName="dnsmasq-dns" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.142721 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c012782a-1d54-4605-9d08-4ffacc6dc1a1" containerName="dnsmasq-dns" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.143896 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.146744 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.147202 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.147381 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.152972 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536236-6c6mp"] Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.208880 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hsfs\" (UniqueName: \"kubernetes.io/projected/5eef3217-3389-48e0-8aa1-e6017e20258d-kube-api-access-6hsfs\") pod \"auto-csr-approver-29536236-6c6mp\" (UID: \"5eef3217-3389-48e0-8aa1-e6017e20258d\") " pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.313028 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hsfs\" (UniqueName: \"kubernetes.io/projected/5eef3217-3389-48e0-8aa1-e6017e20258d-kube-api-access-6hsfs\") pod \"auto-csr-approver-29536236-6c6mp\" (UID: 
\"5eef3217-3389-48e0-8aa1-e6017e20258d\") " pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.341664 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hsfs\" (UniqueName: \"kubernetes.io/projected/5eef3217-3389-48e0-8aa1-e6017e20258d-kube-api-access-6hsfs\") pod \"auto-csr-approver-29536236-6c6mp\" (UID: \"5eef3217-3389-48e0-8aa1-e6017e20258d\") " pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.467692 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:00 crc kubenswrapper[4725]: I0227 06:36:00.929215 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536236-6c6mp"] Feb 27 06:36:01 crc kubenswrapper[4725]: I0227 06:36:01.007042 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" event={"ID":"5eef3217-3389-48e0-8aa1-e6017e20258d","Type":"ContainerStarted","Data":"1cca58f8ecec1f08bb06a2f205c03c91253a70171169f05d03ba8eaa22c957fd"} Feb 27 06:36:02 crc kubenswrapper[4725]: I0227 06:36:02.024856 4725 generic.go:334] "Generic (PLEG): container finished" podID="89e15a0f-61a2-4114-b1cc-385f54f886d3" containerID="701a71632ecf7f58d49e9162e1cd85afed056b79218796bab3e2ed7c45ea2781" exitCode=0 Feb 27 06:36:02 crc kubenswrapper[4725]: I0227 06:36:02.024940 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89e15a0f-61a2-4114-b1cc-385f54f886d3","Type":"ContainerDied","Data":"701a71632ecf7f58d49e9162e1cd85afed056b79218796bab3e2ed7c45ea2781"} Feb 27 06:36:02 crc kubenswrapper[4725]: I0227 06:36:02.029668 4725 generic.go:334] "Generic (PLEG): container finished" podID="e4112b6c-11e8-4244-9a39-c7474ffd192b" 
containerID="866fa46fd5598cdcbe333d94d521944105d986849bc7b60d0e05fd41bd9eae54" exitCode=0 Feb 27 06:36:02 crc kubenswrapper[4725]: I0227 06:36:02.029721 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e4112b6c-11e8-4244-9a39-c7474ffd192b","Type":"ContainerDied","Data":"866fa46fd5598cdcbe333d94d521944105d986849bc7b60d0e05fd41bd9eae54"} Feb 27 06:36:02 crc kubenswrapper[4725]: I0227 06:36:02.554105 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:36:02 crc kubenswrapper[4725]: I0227 06:36:02.554464 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.039534 4725 generic.go:334] "Generic (PLEG): container finished" podID="5eef3217-3389-48e0-8aa1-e6017e20258d" containerID="c98c707a97476f9732720cd0fa8f34bbf5dff542eff9b9b66ded3a073e396187" exitCode=0 Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.039646 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" event={"ID":"5eef3217-3389-48e0-8aa1-e6017e20258d","Type":"ContainerDied","Data":"c98c707a97476f9732720cd0fa8f34bbf5dff542eff9b9b66ded3a073e396187"} Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.042307 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"89e15a0f-61a2-4114-b1cc-385f54f886d3","Type":"ContainerStarted","Data":"e74809362c748accdce20ca595a96fbf609fd16779bf2a1e8b67cb2b1f8dbaa6"} Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.042785 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.045121 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e4112b6c-11e8-4244-9a39-c7474ffd192b","Type":"ContainerStarted","Data":"b43473003e0bae399db72a85d9da0ca44e9f54ed3a0f29d7f948639871c4bf3c"} Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.045331 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.075978 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.075957133 podStartE2EDuration="37.075957133s" podCreationTimestamp="2026-02-27 06:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:36:03.07445622 +0000 UTC m=+1541.537076829" watchObservedRunningTime="2026-02-27 06:36:03.075957133 +0000 UTC m=+1541.538577702" Feb 27 06:36:03 crc kubenswrapper[4725]: I0227 06:36:03.116108 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.116085293 podStartE2EDuration="38.116085293s" podCreationTimestamp="2026-02-27 06:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:36:03.101708058 +0000 UTC m=+1541.564328637" watchObservedRunningTime="2026-02-27 06:36:03.116085293 +0000 UTC m=+1541.578705892" Feb 27 06:36:04 crc kubenswrapper[4725]: I0227 06:36:04.429274 4725 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:04 crc kubenswrapper[4725]: I0227 06:36:04.507656 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hsfs\" (UniqueName: \"kubernetes.io/projected/5eef3217-3389-48e0-8aa1-e6017e20258d-kube-api-access-6hsfs\") pod \"5eef3217-3389-48e0-8aa1-e6017e20258d\" (UID: \"5eef3217-3389-48e0-8aa1-e6017e20258d\") " Feb 27 06:36:04 crc kubenswrapper[4725]: I0227 06:36:04.515671 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eef3217-3389-48e0-8aa1-e6017e20258d-kube-api-access-6hsfs" (OuterVolumeSpecName: "kube-api-access-6hsfs") pod "5eef3217-3389-48e0-8aa1-e6017e20258d" (UID: "5eef3217-3389-48e0-8aa1-e6017e20258d"). InnerVolumeSpecName "kube-api-access-6hsfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:36:04 crc kubenswrapper[4725]: I0227 06:36:04.611345 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hsfs\" (UniqueName: \"kubernetes.io/projected/5eef3217-3389-48e0-8aa1-e6017e20258d-kube-api-access-6hsfs\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:05 crc kubenswrapper[4725]: I0227 06:36:05.096403 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" event={"ID":"5eef3217-3389-48e0-8aa1-e6017e20258d","Type":"ContainerDied","Data":"1cca58f8ecec1f08bb06a2f205c03c91253a70171169f05d03ba8eaa22c957fd"} Feb 27 06:36:05 crc kubenswrapper[4725]: I0227 06:36:05.096454 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cca58f8ecec1f08bb06a2f205c03c91253a70171169f05d03ba8eaa22c957fd" Feb 27 06:36:05 crc kubenswrapper[4725]: I0227 06:36:05.096523 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536236-6c6mp" Feb 27 06:36:05 crc kubenswrapper[4725]: I0227 06:36:05.501160 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536230-wfrlq"] Feb 27 06:36:05 crc kubenswrapper[4725]: I0227 06:36:05.513887 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536230-wfrlq"] Feb 27 06:36:06 crc kubenswrapper[4725]: I0227 06:36:06.265231 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f015d17a-1381-410a-92d4-a28b0a4a4b1b" path="/var/lib/kubelet/pods/f015d17a-1381-410a-92d4-a28b0a4a4b1b/volumes" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.058903 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g"] Feb 27 06:36:11 crc kubenswrapper[4725]: E0227 06:36:11.064515 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eef3217-3389-48e0-8aa1-e6017e20258d" containerName="oc" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.064542 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eef3217-3389-48e0-8aa1-e6017e20258d" containerName="oc" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.065113 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eef3217-3389-48e0-8aa1-e6017e20258d" containerName="oc" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.066432 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.075313 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g"] Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.098495 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.098530 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.098813 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.099003 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.150247 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnqjx\" (UniqueName: \"kubernetes.io/projected/a5ce3d2f-4b00-4971-a37f-3217fd19665a-kube-api-access-qnqjx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.150588 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 
06:36:11.150650 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.150693 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.252511 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.252661 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnqjx\" (UniqueName: \"kubernetes.io/projected/a5ce3d2f-4b00-4971-a37f-3217fd19665a-kube-api-access-qnqjx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.252729 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.252783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.258556 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.259496 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.263685 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.269796 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnqjx\" (UniqueName: \"kubernetes.io/projected/a5ce3d2f-4b00-4971-a37f-3217fd19665a-kube-api-access-qnqjx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:11 crc kubenswrapper[4725]: I0227 06:36:11.411478 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:12 crc kubenswrapper[4725]: I0227 06:36:12.138541 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g"] Feb 27 06:36:12 crc kubenswrapper[4725]: W0227 06:36:12.144592 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ce3d2f_4b00_4971_a37f_3217fd19665a.slice/crio-97dfac03c06889bab997cb5ce81722bc3a27a7ecd0acb66c83fc511cdd7e77f3 WatchSource:0}: Error finding container 97dfac03c06889bab997cb5ce81722bc3a27a7ecd0acb66c83fc511cdd7e77f3: Status 404 returned error can't find the container with id 97dfac03c06889bab997cb5ce81722bc3a27a7ecd0acb66c83fc511cdd7e77f3 Feb 27 06:36:12 crc kubenswrapper[4725]: I0227 06:36:12.178757 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" event={"ID":"a5ce3d2f-4b00-4971-a37f-3217fd19665a","Type":"ContainerStarted","Data":"97dfac03c06889bab997cb5ce81722bc3a27a7ecd0acb66c83fc511cdd7e77f3"} Feb 27 06:36:16 crc kubenswrapper[4725]: I0227 06:36:16.298434 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e4112b6c-11e8-4244-9a39-c7474ffd192b" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.235:5671: connect: connection refused" Feb 27 06:36:16 crc kubenswrapper[4725]: I0227 06:36:16.957806 4725 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="89e15a0f-61a2-4114-b1cc-385f54f886d3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.236:5671: connect: connection refused" Feb 27 06:36:24 crc kubenswrapper[4725]: I0227 06:36:24.875544 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ht56r"] Feb 27 06:36:24 crc kubenswrapper[4725]: I0227 06:36:24.878524 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:24 crc kubenswrapper[4725]: I0227 06:36:24.896346 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ht56r"] Feb 27 06:36:24 crc kubenswrapper[4725]: I0227 06:36:24.946424 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pgd\" (UniqueName: \"kubernetes.io/projected/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-kube-api-access-74pgd\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:24 crc kubenswrapper[4725]: I0227 06:36:24.946473 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-catalog-content\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:24 crc kubenswrapper[4725]: I0227 06:36:24.946574 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-utilities\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.049062 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-utilities\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.049303 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74pgd\" (UniqueName: \"kubernetes.io/projected/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-kube-api-access-74pgd\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.049375 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-catalog-content\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.049711 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-utilities\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.050586 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-catalog-content\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.070609 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pgd\" (UniqueName: \"kubernetes.io/projected/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-kube-api-access-74pgd\") pod \"community-operators-ht56r\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.202138 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:25 crc kubenswrapper[4725]: I0227 06:36:25.752993 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:36:26 crc kubenswrapper[4725]: I0227 06:36:26.239571 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ht56r"] Feb 27 06:36:26 crc kubenswrapper[4725]: W0227 06:36:26.240488 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc1a3b3_6c9d_44c9_a5ed_81e4b6ef3037.slice/crio-bd89e0521f8a8158be3fb5ebdf5612661a8133931d7bcaaf24c711192f11f335 WatchSource:0}: Error finding container bd89e0521f8a8158be3fb5ebdf5612661a8133931d7bcaaf24c711192f11f335: Status 404 returned error can't find the container with id bd89e0521f8a8158be3fb5ebdf5612661a8133931d7bcaaf24c711192f11f335 Feb 27 06:36:26 crc kubenswrapper[4725]: I0227 06:36:26.295457 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 06:36:26 crc kubenswrapper[4725]: I0227 06:36:26.419588 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerStarted","Data":"bd89e0521f8a8158be3fb5ebdf5612661a8133931d7bcaaf24c711192f11f335"} Feb 27 06:36:26 crc kubenswrapper[4725]: I0227 06:36:26.958582 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 27 06:36:27 crc kubenswrapper[4725]: I0227 06:36:27.430144 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" event={"ID":"a5ce3d2f-4b00-4971-a37f-3217fd19665a","Type":"ContainerStarted","Data":"fcdc88e605c4a536b19a69f3d5fe7df177ca1c7a71b0d701b6eb86da535f8b76"} Feb 27 06:36:28 crc kubenswrapper[4725]: I0227 06:36:28.441814 4725 generic.go:334] "Generic (PLEG): container finished" podID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerID="b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b" exitCode=0 Feb 27 06:36:28 crc kubenswrapper[4725]: I0227 06:36:28.441871 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerDied","Data":"b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b"} Feb 27 06:36:28 crc kubenswrapper[4725]: I0227 06:36:28.480410 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" podStartSLOduration=3.880347905 podStartE2EDuration="17.48038036s" podCreationTimestamp="2026-02-27 06:36:11 +0000 UTC" firstStartedPulling="2026-02-27 06:36:12.147422338 +0000 UTC m=+1550.610042907" lastFinishedPulling="2026-02-27 06:36:25.747454783 +0000 UTC m=+1564.210075362" observedRunningTime="2026-02-27 06:36:28.478374203 +0000 UTC m=+1566.940994862" watchObservedRunningTime="2026-02-27 06:36:28.48038036 +0000 UTC m=+1566.943000999" Feb 27 06:36:30 crc kubenswrapper[4725]: I0227 
06:36:30.464674 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerStarted","Data":"f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7"} Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.485279 4725 generic.go:334] "Generic (PLEG): container finished" podID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerID="f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7" exitCode=0 Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.485529 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerDied","Data":"f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7"} Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.554631 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.554704 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.554762 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.556310 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:36:32 crc kubenswrapper[4725]: I0227 06:36:32.556388 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" gracePeriod=600 Feb 27 06:36:32 crc kubenswrapper[4725]: E0227 06:36:32.686454 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:36:33 crc kubenswrapper[4725]: I0227 06:36:33.498695 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerStarted","Data":"85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344"} Feb 27 06:36:33 crc kubenswrapper[4725]: I0227 06:36:33.501948 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" exitCode=0 Feb 27 06:36:33 crc kubenswrapper[4725]: I0227 06:36:33.502030 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d"} Feb 27 06:36:33 crc kubenswrapper[4725]: I0227 06:36:33.502170 4725 scope.go:117] "RemoveContainer" containerID="28426146ca35fe9273793a5717f9e41fb0368e16b25b6d5b4d504e333b929ead" Feb 27 06:36:33 crc kubenswrapper[4725]: I0227 06:36:33.502984 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:36:33 crc kubenswrapper[4725]: E0227 06:36:33.503415 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:36:33 crc kubenswrapper[4725]: I0227 06:36:33.526668 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ht56r" podStartSLOduration=5.089773826 podStartE2EDuration="9.526645394s" podCreationTimestamp="2026-02-27 06:36:24 +0000 UTC" firstStartedPulling="2026-02-27 06:36:28.443207712 +0000 UTC m=+1566.905828281" lastFinishedPulling="2026-02-27 06:36:32.88007926 +0000 UTC m=+1571.342699849" observedRunningTime="2026-02-27 06:36:33.523421694 +0000 UTC m=+1571.986042303" watchObservedRunningTime="2026-02-27 06:36:33.526645394 +0000 UTC m=+1571.989265973" Feb 27 06:36:35 crc kubenswrapper[4725]: I0227 06:36:35.202773 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:35 crc kubenswrapper[4725]: I0227 06:36:35.203248 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:35 crc kubenswrapper[4725]: I0227 06:36:35.272609 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:39 crc kubenswrapper[4725]: I0227 06:36:39.581496 4725 generic.go:334] "Generic (PLEG): container finished" podID="a5ce3d2f-4b00-4971-a37f-3217fd19665a" containerID="fcdc88e605c4a536b19a69f3d5fe7df177ca1c7a71b0d701b6eb86da535f8b76" exitCode=0 Feb 27 06:36:39 crc kubenswrapper[4725]: I0227 06:36:39.581612 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" event={"ID":"a5ce3d2f-4b00-4971-a37f-3217fd19665a","Type":"ContainerDied","Data":"fcdc88e605c4a536b19a69f3d5fe7df177ca1c7a71b0d701b6eb86da535f8b76"} Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.073581 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.217468 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-inventory\") pod \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.217521 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnqjx\" (UniqueName: \"kubernetes.io/projected/a5ce3d2f-4b00-4971-a37f-3217fd19665a-kube-api-access-qnqjx\") pod \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.217672 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-ssh-key-openstack-edpm-ipam\") pod \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.217737 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-repo-setup-combined-ca-bundle\") pod \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\" (UID: \"a5ce3d2f-4b00-4971-a37f-3217fd19665a\") " Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.224236 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ce3d2f-4b00-4971-a37f-3217fd19665a-kube-api-access-qnqjx" (OuterVolumeSpecName: "kube-api-access-qnqjx") pod "a5ce3d2f-4b00-4971-a37f-3217fd19665a" (UID: "a5ce3d2f-4b00-4971-a37f-3217fd19665a"). InnerVolumeSpecName "kube-api-access-qnqjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.239436 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a5ce3d2f-4b00-4971-a37f-3217fd19665a" (UID: "a5ce3d2f-4b00-4971-a37f-3217fd19665a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.247327 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a5ce3d2f-4b00-4971-a37f-3217fd19665a" (UID: "a5ce3d2f-4b00-4971-a37f-3217fd19665a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.263822 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-inventory" (OuterVolumeSpecName: "inventory") pod "a5ce3d2f-4b00-4971-a37f-3217fd19665a" (UID: "a5ce3d2f-4b00-4971-a37f-3217fd19665a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.320036 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.320188 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnqjx\" (UniqueName: \"kubernetes.io/projected/a5ce3d2f-4b00-4971-a37f-3217fd19665a-kube-api-access-qnqjx\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.320248 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.320323 4725 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ce3d2f-4b00-4971-a37f-3217fd19665a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.615838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" event={"ID":"a5ce3d2f-4b00-4971-a37f-3217fd19665a","Type":"ContainerDied","Data":"97dfac03c06889bab997cb5ce81722bc3a27a7ecd0acb66c83fc511cdd7e77f3"} Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.616121 
4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97dfac03c06889bab997cb5ce81722bc3a27a7ecd0acb66c83fc511cdd7e77f3" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.615968 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.731475 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw"] Feb 27 06:36:41 crc kubenswrapper[4725]: E0227 06:36:41.733864 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ce3d2f-4b00-4971-a37f-3217fd19665a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.734025 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ce3d2f-4b00-4971-a37f-3217fd19665a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.734524 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ce3d2f-4b00-4971-a37f-3217fd19665a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.736222 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.739636 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.740422 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.740798 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.745621 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.749178 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw"] Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.933901 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.933953 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:41 crc kubenswrapper[4725]: I0227 06:36:41.933974 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwp6j\" (UniqueName: \"kubernetes.io/projected/3567a664-44a4-4138-82ec-f35dbffffb40-kube-api-access-pwp6j\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.036356 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.036425 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.036456 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwp6j\" (UniqueName: \"kubernetes.io/projected/3567a664-44a4-4138-82ec-f35dbffffb40-kube-api-access-pwp6j\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.041424 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.041499 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.057932 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwp6j\" (UniqueName: \"kubernetes.io/projected/3567a664-44a4-4138-82ec-f35dbffffb40-kube-api-access-pwp6j\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7qtxw\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.116222 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:42 crc kubenswrapper[4725]: I0227 06:36:42.687476 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw"] Feb 27 06:36:42 crc kubenswrapper[4725]: W0227 06:36:42.692542 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3567a664_44a4_4138_82ec_f35dbffffb40.slice/crio-44430ff0400bf13605e7a8a5150bc64750a75b7af58fbc87f7e5e38a4d2438e6 WatchSource:0}: Error finding container 44430ff0400bf13605e7a8a5150bc64750a75b7af58fbc87f7e5e38a4d2438e6: Status 404 returned error can't find the container with id 44430ff0400bf13605e7a8a5150bc64750a75b7af58fbc87f7e5e38a4d2438e6 Feb 27 06:36:43 crc kubenswrapper[4725]: I0227 06:36:43.634882 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" event={"ID":"3567a664-44a4-4138-82ec-f35dbffffb40","Type":"ContainerStarted","Data":"8b3a13139c1e35c68508defd90995fecd809e00b8f73140267e5258601db49a7"} Feb 27 06:36:43 crc kubenswrapper[4725]: I0227 06:36:43.635653 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" event={"ID":"3567a664-44a4-4138-82ec-f35dbffffb40","Type":"ContainerStarted","Data":"44430ff0400bf13605e7a8a5150bc64750a75b7af58fbc87f7e5e38a4d2438e6"} Feb 27 06:36:43 crc kubenswrapper[4725]: I0227 06:36:43.653106 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" podStartSLOduration=2.153567596 podStartE2EDuration="2.653088148s" podCreationTimestamp="2026-02-27 06:36:41 +0000 UTC" firstStartedPulling="2026-02-27 06:36:42.695446411 +0000 UTC m=+1581.158066990" lastFinishedPulling="2026-02-27 06:36:43.194966973 +0000 UTC m=+1581.657587542" observedRunningTime="2026-02-27 
06:36:43.651661038 +0000 UTC m=+1582.114281617" watchObservedRunningTime="2026-02-27 06:36:43.653088148 +0000 UTC m=+1582.115708717" Feb 27 06:36:45 crc kubenswrapper[4725]: I0227 06:36:45.284740 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:45 crc kubenswrapper[4725]: I0227 06:36:45.344650 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ht56r"] Feb 27 06:36:45 crc kubenswrapper[4725]: I0227 06:36:45.652862 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ht56r" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="registry-server" containerID="cri-o://85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344" gracePeriod=2 Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.130893 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.321958 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-catalog-content\") pod \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.322250 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74pgd\" (UniqueName: \"kubernetes.io/projected/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-kube-api-access-74pgd\") pod \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.322344 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-utilities\") pod \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\" (UID: \"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037\") " Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.324388 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-utilities" (OuterVolumeSpecName: "utilities") pod "5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" (UID: "5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.327769 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-kube-api-access-74pgd" (OuterVolumeSpecName: "kube-api-access-74pgd") pod "5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" (UID: "5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037"). InnerVolumeSpecName "kube-api-access-74pgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.379889 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" (UID: "5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.425921 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74pgd\" (UniqueName: \"kubernetes.io/projected/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-kube-api-access-74pgd\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.425953 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.425965 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.665803 4725 generic.go:334] "Generic (PLEG): container finished" podID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerID="85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344" exitCode=0 Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.665892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerDied","Data":"85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344"} Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.665891 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ht56r" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.665941 4725 scope.go:117] "RemoveContainer" containerID="85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.665927 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht56r" event={"ID":"5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037","Type":"ContainerDied","Data":"bd89e0521f8a8158be3fb5ebdf5612661a8133931d7bcaaf24c711192f11f335"} Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.667959 4725 generic.go:334] "Generic (PLEG): container finished" podID="3567a664-44a4-4138-82ec-f35dbffffb40" containerID="8b3a13139c1e35c68508defd90995fecd809e00b8f73140267e5258601db49a7" exitCode=0 Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.667990 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" event={"ID":"3567a664-44a4-4138-82ec-f35dbffffb40","Type":"ContainerDied","Data":"8b3a13139c1e35c68508defd90995fecd809e00b8f73140267e5258601db49a7"} Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.701785 4725 scope.go:117] "RemoveContainer" containerID="f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.757740 4725 scope.go:117] "RemoveContainer" containerID="b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.764058 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ht56r"] Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.783084 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ht56r"] Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.804550 4725 scope.go:117] "RemoveContainer" 
containerID="85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344" Feb 27 06:36:46 crc kubenswrapper[4725]: E0227 06:36:46.805660 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344\": container with ID starting with 85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344 not found: ID does not exist" containerID="85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.805703 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344"} err="failed to get container status \"85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344\": rpc error: code = NotFound desc = could not find container \"85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344\": container with ID starting with 85e9965cd93d5a63fdd337d6be52048d66eff1faf9d9ed2e7e488a909e5fc344 not found: ID does not exist" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.805729 4725 scope.go:117] "RemoveContainer" containerID="f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7" Feb 27 06:36:46 crc kubenswrapper[4725]: E0227 06:36:46.806684 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7\": container with ID starting with f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7 not found: ID does not exist" containerID="f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.806735 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7"} err="failed to get container status \"f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7\": rpc error: code = NotFound desc = could not find container \"f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7\": container with ID starting with f551c394833638bbc22ca856e10df1dabee90e5e7ecc58b37345e67f718264f7 not found: ID does not exist" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.806763 4725 scope.go:117] "RemoveContainer" containerID="b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b" Feb 27 06:36:46 crc kubenswrapper[4725]: E0227 06:36:46.806982 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b\": container with ID starting with b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b not found: ID does not exist" containerID="b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b" Feb 27 06:36:46 crc kubenswrapper[4725]: I0227 06:36:46.807008 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b"} err="failed to get container status \"b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b\": rpc error: code = NotFound desc = could not find container \"b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b\": container with ID starting with b70ea48e2d5c2e4f021b641cf4b4d2c31fc3a1fe82a679f59b71eb985442083b not found: ID does not exist" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.168673 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.252268 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:36:48 crc kubenswrapper[4725]: E0227 06:36:48.252665 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.270796 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" path="/var/lib/kubelet/pods/5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037/volumes" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.367983 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-ssh-key-openstack-edpm-ipam\") pod \"3567a664-44a4-4138-82ec-f35dbffffb40\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.368357 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-inventory\") pod \"3567a664-44a4-4138-82ec-f35dbffffb40\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.368460 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwp6j\" (UniqueName: 
\"kubernetes.io/projected/3567a664-44a4-4138-82ec-f35dbffffb40-kube-api-access-pwp6j\") pod \"3567a664-44a4-4138-82ec-f35dbffffb40\" (UID: \"3567a664-44a4-4138-82ec-f35dbffffb40\") " Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.377558 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3567a664-44a4-4138-82ec-f35dbffffb40-kube-api-access-pwp6j" (OuterVolumeSpecName: "kube-api-access-pwp6j") pod "3567a664-44a4-4138-82ec-f35dbffffb40" (UID: "3567a664-44a4-4138-82ec-f35dbffffb40"). InnerVolumeSpecName "kube-api-access-pwp6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.397998 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3567a664-44a4-4138-82ec-f35dbffffb40" (UID: "3567a664-44a4-4138-82ec-f35dbffffb40"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.409727 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-inventory" (OuterVolumeSpecName: "inventory") pod "3567a664-44a4-4138-82ec-f35dbffffb40" (UID: "3567a664-44a4-4138-82ec-f35dbffffb40"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.470366 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.470401 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwp6j\" (UniqueName: \"kubernetes.io/projected/3567a664-44a4-4138-82ec-f35dbffffb40-kube-api-access-pwp6j\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.470417 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3567a664-44a4-4138-82ec-f35dbffffb40-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.704697 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" event={"ID":"3567a664-44a4-4138-82ec-f35dbffffb40","Type":"ContainerDied","Data":"44430ff0400bf13605e7a8a5150bc64750a75b7af58fbc87f7e5e38a4d2438e6"} Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.704764 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44430ff0400bf13605e7a8a5150bc64750a75b7af58fbc87f7e5e38a4d2438e6" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.704842 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7qtxw" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.818858 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74"] Feb 27 06:36:48 crc kubenswrapper[4725]: E0227 06:36:48.819503 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="extract-utilities" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.819537 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="extract-utilities" Feb 27 06:36:48 crc kubenswrapper[4725]: E0227 06:36:48.819581 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3567a664-44a4-4138-82ec-f35dbffffb40" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.819595 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="3567a664-44a4-4138-82ec-f35dbffffb40" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 06:36:48 crc kubenswrapper[4725]: E0227 06:36:48.819618 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="extract-content" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.819628 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="extract-content" Feb 27 06:36:48 crc kubenswrapper[4725]: E0227 06:36:48.819652 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="registry-server" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.819672 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="registry-server" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.819941 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3567a664-44a4-4138-82ec-f35dbffffb40" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.819966 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc1a3b3-6c9d-44c9-a5ed-81e4b6ef3037" containerName="registry-server" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.821248 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.823511 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.823703 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.823819 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.826831 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.834619 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74"] Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.879233 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.879612 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.879655 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.879752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jk7\" (UniqueName: \"kubernetes.io/projected/6bd144c7-5dac-46d4-8cab-b3a31a352974-kube-api-access-27jk7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.980876 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.981012 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.981063 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.981164 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jk7\" (UniqueName: \"kubernetes.io/projected/6bd144c7-5dac-46d4-8cab-b3a31a352974-kube-api-access-27jk7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.985768 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.991558 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: 
\"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.993914 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:48 crc kubenswrapper[4725]: I0227 06:36:48.996467 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jk7\" (UniqueName: \"kubernetes.io/projected/6bd144c7-5dac-46d4-8cab-b3a31a352974-kube-api-access-27jk7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:49 crc kubenswrapper[4725]: I0227 06:36:49.145530 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:36:49 crc kubenswrapper[4725]: I0227 06:36:49.738421 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74"] Feb 27 06:36:50 crc kubenswrapper[4725]: I0227 06:36:50.740483 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" event={"ID":"6bd144c7-5dac-46d4-8cab-b3a31a352974","Type":"ContainerStarted","Data":"338f1512fbdd710ffb6e13cf5dc5c23b77a7788de40dbe239dd65129b6c07ace"} Feb 27 06:36:50 crc kubenswrapper[4725]: I0227 06:36:50.741134 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" event={"ID":"6bd144c7-5dac-46d4-8cab-b3a31a352974","Type":"ContainerStarted","Data":"0d42e054e416d44f91c6729b6aaf3e0a87a98aab1d9e63724da51c1baf65a1a0"} Feb 27 06:36:50 crc kubenswrapper[4725]: I0227 06:36:50.764655 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" podStartSLOduration=2.338670892 podStartE2EDuration="2.764632352s" podCreationTimestamp="2026-02-27 06:36:48 +0000 UTC" firstStartedPulling="2026-02-27 06:36:49.743707732 +0000 UTC m=+1588.206328301" lastFinishedPulling="2026-02-27 06:36:50.169669182 +0000 UTC m=+1588.632289761" observedRunningTime="2026-02-27 06:36:50.761487723 +0000 UTC m=+1589.224108302" watchObservedRunningTime="2026-02-27 06:36:50.764632352 +0000 UTC m=+1589.227252961" Feb 27 06:37:00 crc kubenswrapper[4725]: I0227 06:37:00.508799 4725 scope.go:117] "RemoveContainer" containerID="d5d9320113c890c37f0856d858e7bc52c906fea1717b0c40f482e7db39672ea4" Feb 27 06:37:00 crc kubenswrapper[4725]: I0227 06:37:00.547447 4725 scope.go:117] "RemoveContainer" containerID="a6361d62d111ad5e17d07b994b50dc79f8c362ecc94e02190c469c11966749d9" Feb 27 06:37:00 crc 
kubenswrapper[4725]: I0227 06:37:00.623252 4725 scope.go:117] "RemoveContainer" containerID="2b67a7c7e9507fe801321c5b5c42386650fa86500c931c5f96c64d0817d01649" Feb 27 06:37:03 crc kubenswrapper[4725]: I0227 06:37:03.251144 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:37:03 crc kubenswrapper[4725]: E0227 06:37:03.251927 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:37:18 crc kubenswrapper[4725]: I0227 06:37:18.252217 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:37:18 crc kubenswrapper[4725]: E0227 06:37:18.253142 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:37:32 crc kubenswrapper[4725]: I0227 06:37:32.264644 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:37:32 crc kubenswrapper[4725]: E0227 06:37:32.266993 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:37:45 crc kubenswrapper[4725]: I0227 06:37:45.252353 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:37:45 crc kubenswrapper[4725]: E0227 06:37:45.253153 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.254338 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:37:57 crc kubenswrapper[4725]: E0227 06:37:57.255482 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.340761 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gb4tt"] Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.343689 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.351656 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gb4tt"] Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.442461 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g25b\" (UniqueName: \"kubernetes.io/projected/85bedea8-2295-46a6-ad38-95aab7ffc6d5-kube-api-access-4g25b\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.442800 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-utilities\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.442969 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-catalog-content\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.545577 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g25b\" (UniqueName: \"kubernetes.io/projected/85bedea8-2295-46a6-ad38-95aab7ffc6d5-kube-api-access-4g25b\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.545987 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-utilities\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.546064 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-catalog-content\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.546496 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-utilities\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.546619 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-catalog-content\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.565585 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g25b\" (UniqueName: \"kubernetes.io/projected/85bedea8-2295-46a6-ad38-95aab7ffc6d5-kube-api-access-4g25b\") pod \"certified-operators-gb4tt\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:57 crc kubenswrapper[4725]: I0227 06:37:57.674825 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:37:58 crc kubenswrapper[4725]: I0227 06:37:58.189742 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gb4tt"] Feb 27 06:37:58 crc kubenswrapper[4725]: I0227 06:37:58.498827 4725 generic.go:334] "Generic (PLEG): container finished" podID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerID="31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f" exitCode=0 Feb 27 06:37:58 crc kubenswrapper[4725]: I0227 06:37:58.498896 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb4tt" event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerDied","Data":"31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f"} Feb 27 06:37:58 crc kubenswrapper[4725]: I0227 06:37:58.499135 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb4tt" event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerStarted","Data":"218cb51819b35f341ca876f7e1b036c0fb9a29dba1562a8251cb37ae6f2bcdf4"} Feb 27 06:37:59 crc kubenswrapper[4725]: I0227 06:37:59.512645 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb4tt" event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerStarted","Data":"ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d"} Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.151223 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536238-p95pp"] Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.153174 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.156612 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.156967 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.157028 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.163951 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536238-p95pp"] Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.313392 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dftl9\" (UniqueName: \"kubernetes.io/projected/01523399-3075-45db-8d71-e7adcd7c6e5a-kube-api-access-dftl9\") pod \"auto-csr-approver-29536238-p95pp\" (UID: \"01523399-3075-45db-8d71-e7adcd7c6e5a\") " pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.415753 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dftl9\" (UniqueName: \"kubernetes.io/projected/01523399-3075-45db-8d71-e7adcd7c6e5a-kube-api-access-dftl9\") pod \"auto-csr-approver-29536238-p95pp\" (UID: \"01523399-3075-45db-8d71-e7adcd7c6e5a\") " pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.449972 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dftl9\" (UniqueName: \"kubernetes.io/projected/01523399-3075-45db-8d71-e7adcd7c6e5a-kube-api-access-dftl9\") pod \"auto-csr-approver-29536238-p95pp\" (UID: \"01523399-3075-45db-8d71-e7adcd7c6e5a\") " 
pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.475395 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.812037 4725 scope.go:117] "RemoveContainer" containerID="af9a5ea31ff62326be7796d50e91aa79a890cd4050b974077a59b842f558133b" Feb 27 06:38:00 crc kubenswrapper[4725]: I0227 06:38:00.993147 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536238-p95pp"] Feb 27 06:38:01 crc kubenswrapper[4725]: W0227 06:38:01.018592 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01523399_3075_45db_8d71_e7adcd7c6e5a.slice/crio-f966c6809adea68c6ecd1bf0f4330c956807c6295ee07d5c99299109a6464b26 WatchSource:0}: Error finding container f966c6809adea68c6ecd1bf0f4330c956807c6295ee07d5c99299109a6464b26: Status 404 returned error can't find the container with id f966c6809adea68c6ecd1bf0f4330c956807c6295ee07d5c99299109a6464b26 Feb 27 06:38:01 crc kubenswrapper[4725]: I0227 06:38:01.536824 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536238-p95pp" event={"ID":"01523399-3075-45db-8d71-e7adcd7c6e5a","Type":"ContainerStarted","Data":"f966c6809adea68c6ecd1bf0f4330c956807c6295ee07d5c99299109a6464b26"} Feb 27 06:38:01 crc kubenswrapper[4725]: I0227 06:38:01.539566 4725 generic.go:334] "Generic (PLEG): container finished" podID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerID="ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d" exitCode=0 Feb 27 06:38:01 crc kubenswrapper[4725]: I0227 06:38:01.539611 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb4tt" 
event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerDied","Data":"ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d"} Feb 27 06:38:02 crc kubenswrapper[4725]: I0227 06:38:02.550793 4725 generic.go:334] "Generic (PLEG): container finished" podID="01523399-3075-45db-8d71-e7adcd7c6e5a" containerID="3541b814b0fae0eb3a9e991dda2a227e7421c9d59d42990f2b1a6f091b0203ab" exitCode=0 Feb 27 06:38:02 crc kubenswrapper[4725]: I0227 06:38:02.550977 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536238-p95pp" event={"ID":"01523399-3075-45db-8d71-e7adcd7c6e5a","Type":"ContainerDied","Data":"3541b814b0fae0eb3a9e991dda2a227e7421c9d59d42990f2b1a6f091b0203ab"} Feb 27 06:38:02 crc kubenswrapper[4725]: I0227 06:38:02.553332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb4tt" event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerStarted","Data":"3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27"} Feb 27 06:38:02 crc kubenswrapper[4725]: I0227 06:38:02.589023 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gb4tt" podStartSLOduration=2.128183471 podStartE2EDuration="5.589004854s" podCreationTimestamp="2026-02-27 06:37:57 +0000 UTC" firstStartedPulling="2026-02-27 06:37:58.500632527 +0000 UTC m=+1656.963253106" lastFinishedPulling="2026-02-27 06:38:01.96145387 +0000 UTC m=+1660.424074489" observedRunningTime="2026-02-27 06:38:02.58777951 +0000 UTC m=+1661.050400089" watchObservedRunningTime="2026-02-27 06:38:02.589004854 +0000 UTC m=+1661.051625423" Feb 27 06:38:03 crc kubenswrapper[4725]: I0227 06:38:03.947824 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:04 crc kubenswrapper[4725]: I0227 06:38:04.087812 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dftl9\" (UniqueName: \"kubernetes.io/projected/01523399-3075-45db-8d71-e7adcd7c6e5a-kube-api-access-dftl9\") pod \"01523399-3075-45db-8d71-e7adcd7c6e5a\" (UID: \"01523399-3075-45db-8d71-e7adcd7c6e5a\") " Feb 27 06:38:04 crc kubenswrapper[4725]: I0227 06:38:04.093127 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01523399-3075-45db-8d71-e7adcd7c6e5a-kube-api-access-dftl9" (OuterVolumeSpecName: "kube-api-access-dftl9") pod "01523399-3075-45db-8d71-e7adcd7c6e5a" (UID: "01523399-3075-45db-8d71-e7adcd7c6e5a"). InnerVolumeSpecName "kube-api-access-dftl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:38:04 crc kubenswrapper[4725]: I0227 06:38:04.190317 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dftl9\" (UniqueName: \"kubernetes.io/projected/01523399-3075-45db-8d71-e7adcd7c6e5a-kube-api-access-dftl9\") on node \"crc\" DevicePath \"\"" Feb 27 06:38:04 crc kubenswrapper[4725]: I0227 06:38:04.573983 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536238-p95pp" event={"ID":"01523399-3075-45db-8d71-e7adcd7c6e5a","Type":"ContainerDied","Data":"f966c6809adea68c6ecd1bf0f4330c956807c6295ee07d5c99299109a6464b26"} Feb 27 06:38:04 crc kubenswrapper[4725]: I0227 06:38:04.574031 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f966c6809adea68c6ecd1bf0f4330c956807c6295ee07d5c99299109a6464b26" Feb 27 06:38:04 crc kubenswrapper[4725]: I0227 06:38:04.574093 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536238-p95pp" Feb 27 06:38:05 crc kubenswrapper[4725]: I0227 06:38:05.019380 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536232-8xqk7"] Feb 27 06:38:05 crc kubenswrapper[4725]: I0227 06:38:05.032769 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536232-8xqk7"] Feb 27 06:38:06 crc kubenswrapper[4725]: I0227 06:38:06.269276 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5976615b-8ad2-4d95-9702-5b003064ee5c" path="/var/lib/kubelet/pods/5976615b-8ad2-4d95-9702-5b003064ee5c/volumes" Feb 27 06:38:07 crc kubenswrapper[4725]: I0227 06:38:07.675773 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:38:07 crc kubenswrapper[4725]: I0227 06:38:07.676119 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:38:08 crc kubenswrapper[4725]: I0227 06:38:08.252152 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:38:08 crc kubenswrapper[4725]: E0227 06:38:08.252404 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:38:08 crc kubenswrapper[4725]: I0227 06:38:08.721870 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gb4tt" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="registry-server" 
probeResult="failure" output=< Feb 27 06:38:08 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:38:08 crc kubenswrapper[4725]: > Feb 27 06:38:17 crc kubenswrapper[4725]: I0227 06:38:17.731085 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:38:17 crc kubenswrapper[4725]: I0227 06:38:17.800017 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:38:17 crc kubenswrapper[4725]: I0227 06:38:17.963485 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gb4tt"] Feb 27 06:38:18 crc kubenswrapper[4725]: I0227 06:38:18.776945 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gb4tt" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="registry-server" containerID="cri-o://3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27" gracePeriod=2 Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.302431 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.329672 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-catalog-content\") pod \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.329892 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g25b\" (UniqueName: \"kubernetes.io/projected/85bedea8-2295-46a6-ad38-95aab7ffc6d5-kube-api-access-4g25b\") pod \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.329947 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-utilities\") pod \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\" (UID: \"85bedea8-2295-46a6-ad38-95aab7ffc6d5\") " Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.331974 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-utilities" (OuterVolumeSpecName: "utilities") pod "85bedea8-2295-46a6-ad38-95aab7ffc6d5" (UID: "85bedea8-2295-46a6-ad38-95aab7ffc6d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.345990 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bedea8-2295-46a6-ad38-95aab7ffc6d5-kube-api-access-4g25b" (OuterVolumeSpecName: "kube-api-access-4g25b") pod "85bedea8-2295-46a6-ad38-95aab7ffc6d5" (UID: "85bedea8-2295-46a6-ad38-95aab7ffc6d5"). InnerVolumeSpecName "kube-api-access-4g25b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.397410 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85bedea8-2295-46a6-ad38-95aab7ffc6d5" (UID: "85bedea8-2295-46a6-ad38-95aab7ffc6d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.431937 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g25b\" (UniqueName: \"kubernetes.io/projected/85bedea8-2295-46a6-ad38-95aab7ffc6d5-kube-api-access-4g25b\") on node \"crc\" DevicePath \"\"" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.431977 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.431986 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bedea8-2295-46a6-ad38-95aab7ffc6d5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.787020 4725 generic.go:334] "Generic (PLEG): container finished" podID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerID="3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27" exitCode=0 Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.787065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb4tt" event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerDied","Data":"3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27"} Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.787142 4725 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gb4tt" event={"ID":"85bedea8-2295-46a6-ad38-95aab7ffc6d5","Type":"ContainerDied","Data":"218cb51819b35f341ca876f7e1b036c0fb9a29dba1562a8251cb37ae6f2bcdf4"} Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.787186 4725 scope.go:117] "RemoveContainer" containerID="3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.788937 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gb4tt" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.815413 4725 scope.go:117] "RemoveContainer" containerID="ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.843779 4725 scope.go:117] "RemoveContainer" containerID="31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.848105 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gb4tt"] Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.861165 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gb4tt"] Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.897786 4725 scope.go:117] "RemoveContainer" containerID="3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27" Feb 27 06:38:19 crc kubenswrapper[4725]: E0227 06:38:19.898467 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27\": container with ID starting with 3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27 not found: ID does not exist" containerID="3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 
06:38:19.898502 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27"} err="failed to get container status \"3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27\": rpc error: code = NotFound desc = could not find container \"3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27\": container with ID starting with 3fe70e8b4f0a5c1b6349a773dc581b4152844a84878d9447161fad71a439ef27 not found: ID does not exist" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.898529 4725 scope.go:117] "RemoveContainer" containerID="ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d" Feb 27 06:38:19 crc kubenswrapper[4725]: E0227 06:38:19.898773 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d\": container with ID starting with ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d not found: ID does not exist" containerID="ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.898803 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d"} err="failed to get container status \"ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d\": rpc error: code = NotFound desc = could not find container \"ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d\": container with ID starting with ef36e1ef01f82f064c4a3099b9e809e77fe631f10cbc63e4ec5315bc9ba9df5d not found: ID does not exist" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.898820 4725 scope.go:117] "RemoveContainer" containerID="31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f" Feb 27 06:38:19 crc 
kubenswrapper[4725]: E0227 06:38:19.899070 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f\": container with ID starting with 31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f not found: ID does not exist" containerID="31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f" Feb 27 06:38:19 crc kubenswrapper[4725]: I0227 06:38:19.899093 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f"} err="failed to get container status \"31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f\": rpc error: code = NotFound desc = could not find container \"31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f\": container with ID starting with 31446a1f289a3654505af8258d1aa343d5be543c793afab59d8f74529eae605f not found: ID does not exist" Feb 27 06:38:20 crc kubenswrapper[4725]: I0227 06:38:20.251903 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:38:20 crc kubenswrapper[4725]: E0227 06:38:20.252222 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:38:20 crc kubenswrapper[4725]: I0227 06:38:20.262215 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" path="/var/lib/kubelet/pods/85bedea8-2295-46a6-ad38-95aab7ffc6d5/volumes" Feb 27 06:38:31 crc 
kubenswrapper[4725]: I0227 06:38:31.251494 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:38:31 crc kubenswrapper[4725]: E0227 06:38:31.252290 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:38:42 crc kubenswrapper[4725]: I0227 06:38:42.261746 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:38:42 crc kubenswrapper[4725]: E0227 06:38:42.263892 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:38:57 crc kubenswrapper[4725]: I0227 06:38:57.251562 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:38:57 crc kubenswrapper[4725]: E0227 06:38:57.252565 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 
27 06:39:00 crc kubenswrapper[4725]: I0227 06:39:00.889444 4725 scope.go:117] "RemoveContainer" containerID="eb26b6f9487a111ec10255aa47f4c6aedd16d7e02c896d1ff37e665ab035efa5" Feb 27 06:39:08 crc kubenswrapper[4725]: I0227 06:39:08.252183 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:39:08 crc kubenswrapper[4725]: E0227 06:39:08.252980 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:39:19 crc kubenswrapper[4725]: I0227 06:39:19.252045 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:39:19 crc kubenswrapper[4725]: E0227 06:39:19.253072 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:39:30 crc kubenswrapper[4725]: I0227 06:39:30.293782 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:39:30 crc kubenswrapper[4725]: E0227 06:39:30.295626 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:39:41 crc kubenswrapper[4725]: I0227 06:39:41.251394 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:39:41 crc kubenswrapper[4725]: E0227 06:39:41.252361 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:39:56 crc kubenswrapper[4725]: I0227 06:39:56.252640 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:39:56 crc kubenswrapper[4725]: E0227 06:39:56.253987 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.177934 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536240-c7jgl"] Feb 27 06:40:00 crc kubenswrapper[4725]: E0227 06:40:00.179604 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01523399-3075-45db-8d71-e7adcd7c6e5a" containerName="oc" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.179637 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="01523399-3075-45db-8d71-e7adcd7c6e5a" containerName="oc" Feb 27 06:40:00 crc kubenswrapper[4725]: E0227 06:40:00.179663 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="registry-server" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.179677 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="registry-server" Feb 27 06:40:00 crc kubenswrapper[4725]: E0227 06:40:00.179720 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="extract-utilities" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.179735 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="extract-utilities" Feb 27 06:40:00 crc kubenswrapper[4725]: E0227 06:40:00.179753 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="extract-content" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.179766 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="extract-content" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.180148 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="01523399-3075-45db-8d71-e7adcd7c6e5a" containerName="oc" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.180184 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bedea8-2295-46a6-ad38-95aab7ffc6d5" containerName="registry-server" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.181706 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.186502 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.187019 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.187261 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.192120 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536240-c7jgl"] Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.294309 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2n86\" (UniqueName: \"kubernetes.io/projected/d9b9899d-9ed8-41a3-a1c5-c70459205e2a-kube-api-access-p2n86\") pod \"auto-csr-approver-29536240-c7jgl\" (UID: \"d9b9899d-9ed8-41a3-a1c5-c70459205e2a\") " pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.395925 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2n86\" (UniqueName: \"kubernetes.io/projected/d9b9899d-9ed8-41a3-a1c5-c70459205e2a-kube-api-access-p2n86\") pod \"auto-csr-approver-29536240-c7jgl\" (UID: \"d9b9899d-9ed8-41a3-a1c5-c70459205e2a\") " pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.419784 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2n86\" (UniqueName: \"kubernetes.io/projected/d9b9899d-9ed8-41a3-a1c5-c70459205e2a-kube-api-access-p2n86\") pod \"auto-csr-approver-29536240-c7jgl\" (UID: \"d9b9899d-9ed8-41a3-a1c5-c70459205e2a\") " 
pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.508344 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:00 crc kubenswrapper[4725]: I0227 06:40:00.999499 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536240-c7jgl"] Feb 27 06:40:01 crc kubenswrapper[4725]: I0227 06:40:01.023958 4725 scope.go:117] "RemoveContainer" containerID="fd1998b0e1b622db24ec2e1af09893d4ac39704d70bc87e391e687da67841d45" Feb 27 06:40:01 crc kubenswrapper[4725]: I0227 06:40:01.057453 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" event={"ID":"d9b9899d-9ed8-41a3-a1c5-c70459205e2a","Type":"ContainerStarted","Data":"6b2b3ccdafd06f2d36ec25f85e0492e2a589373db247ecea7dd59a14c22ed89b"} Feb 27 06:40:01 crc kubenswrapper[4725]: I0227 06:40:01.087659 4725 scope.go:117] "RemoveContainer" containerID="d2f10201c241ff6af707f25696d6b624017f53cc5590dac5e80941042f4faad5" Feb 27 06:40:01 crc kubenswrapper[4725]: I0227 06:40:01.124568 4725 scope.go:117] "RemoveContainer" containerID="bf811d13d002ced308326ee58960f034e249e82b4acebe5e02eec0b05d312fef" Feb 27 06:40:01 crc kubenswrapper[4725]: I0227 06:40:01.153637 4725 scope.go:117] "RemoveContainer" containerID="9d98baf31598be01014934cf4857b43b2a77e64c204543b5d7a508b969930958" Feb 27 06:40:01 crc kubenswrapper[4725]: I0227 06:40:01.197449 4725 scope.go:117] "RemoveContainer" containerID="829e3fd17516b62cd0b9a2a3bda351e02af1a836238e88f3a7d78e0bc339eb56" Feb 27 06:40:03 crc kubenswrapper[4725]: I0227 06:40:03.088041 4725 generic.go:334] "Generic (PLEG): container finished" podID="d9b9899d-9ed8-41a3-a1c5-c70459205e2a" containerID="cc901290260babac09388ab371c70964b7aaa477716e23601ce2a10d1c99cfe9" exitCode=0 Feb 27 06:40:03 crc kubenswrapper[4725]: I0227 06:40:03.088157 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" event={"ID":"d9b9899d-9ed8-41a3-a1c5-c70459205e2a","Type":"ContainerDied","Data":"cc901290260babac09388ab371c70964b7aaa477716e23601ce2a10d1c99cfe9"} Feb 27 06:40:04 crc kubenswrapper[4725]: I0227 06:40:04.421351 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:04 crc kubenswrapper[4725]: I0227 06:40:04.490600 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2n86\" (UniqueName: \"kubernetes.io/projected/d9b9899d-9ed8-41a3-a1c5-c70459205e2a-kube-api-access-p2n86\") pod \"d9b9899d-9ed8-41a3-a1c5-c70459205e2a\" (UID: \"d9b9899d-9ed8-41a3-a1c5-c70459205e2a\") " Feb 27 06:40:04 crc kubenswrapper[4725]: I0227 06:40:04.497894 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b9899d-9ed8-41a3-a1c5-c70459205e2a-kube-api-access-p2n86" (OuterVolumeSpecName: "kube-api-access-p2n86") pod "d9b9899d-9ed8-41a3-a1c5-c70459205e2a" (UID: "d9b9899d-9ed8-41a3-a1c5-c70459205e2a"). InnerVolumeSpecName "kube-api-access-p2n86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:40:04 crc kubenswrapper[4725]: I0227 06:40:04.592851 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2n86\" (UniqueName: \"kubernetes.io/projected/d9b9899d-9ed8-41a3-a1c5-c70459205e2a-kube-api-access-p2n86\") on node \"crc\" DevicePath \"\"" Feb 27 06:40:05 crc kubenswrapper[4725]: I0227 06:40:05.115675 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" event={"ID":"d9b9899d-9ed8-41a3-a1c5-c70459205e2a","Type":"ContainerDied","Data":"6b2b3ccdafd06f2d36ec25f85e0492e2a589373db247ecea7dd59a14c22ed89b"} Feb 27 06:40:05 crc kubenswrapper[4725]: I0227 06:40:05.115716 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2b3ccdafd06f2d36ec25f85e0492e2a589373db247ecea7dd59a14c22ed89b" Feb 27 06:40:05 crc kubenswrapper[4725]: I0227 06:40:05.115759 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536240-c7jgl" Feb 27 06:40:05 crc kubenswrapper[4725]: I0227 06:40:05.508720 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536234-868m6"] Feb 27 06:40:05 crc kubenswrapper[4725]: I0227 06:40:05.520163 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536234-868m6"] Feb 27 06:40:06 crc kubenswrapper[4725]: I0227 06:40:06.274644 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b093247-2faf-4e4f-9164-24ac3654ca46" path="/var/lib/kubelet/pods/4b093247-2faf-4e4f-9164-24ac3654ca46/volumes" Feb 27 06:40:07 crc kubenswrapper[4725]: I0227 06:40:07.252714 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:40:07 crc kubenswrapper[4725]: E0227 06:40:07.253184 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:40:08 crc kubenswrapper[4725]: I0227 06:40:08.149966 4725 generic.go:334] "Generic (PLEG): container finished" podID="6bd144c7-5dac-46d4-8cab-b3a31a352974" containerID="338f1512fbdd710ffb6e13cf5dc5c23b77a7788de40dbe239dd65129b6c07ace" exitCode=0 Feb 27 06:40:08 crc kubenswrapper[4725]: I0227 06:40:08.150057 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" event={"ID":"6bd144c7-5dac-46d4-8cab-b3a31a352974","Type":"ContainerDied","Data":"338f1512fbdd710ffb6e13cf5dc5c23b77a7788de40dbe239dd65129b6c07ace"} Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.640864 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.711021 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27jk7\" (UniqueName: \"kubernetes.io/projected/6bd144c7-5dac-46d4-8cab-b3a31a352974-kube-api-access-27jk7\") pod \"6bd144c7-5dac-46d4-8cab-b3a31a352974\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.711202 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-inventory\") pod \"6bd144c7-5dac-46d4-8cab-b3a31a352974\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.711385 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-bootstrap-combined-ca-bundle\") pod \"6bd144c7-5dac-46d4-8cab-b3a31a352974\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.711514 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-ssh-key-openstack-edpm-ipam\") pod \"6bd144c7-5dac-46d4-8cab-b3a31a352974\" (UID: \"6bd144c7-5dac-46d4-8cab-b3a31a352974\") " Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.717179 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6bd144c7-5dac-46d4-8cab-b3a31a352974" (UID: "6bd144c7-5dac-46d4-8cab-b3a31a352974"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.717270 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd144c7-5dac-46d4-8cab-b3a31a352974-kube-api-access-27jk7" (OuterVolumeSpecName: "kube-api-access-27jk7") pod "6bd144c7-5dac-46d4-8cab-b3a31a352974" (UID: "6bd144c7-5dac-46d4-8cab-b3a31a352974"). InnerVolumeSpecName "kube-api-access-27jk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.741295 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6bd144c7-5dac-46d4-8cab-b3a31a352974" (UID: "6bd144c7-5dac-46d4-8cab-b3a31a352974"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.763915 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-inventory" (OuterVolumeSpecName: "inventory") pod "6bd144c7-5dac-46d4-8cab-b3a31a352974" (UID: "6bd144c7-5dac-46d4-8cab-b3a31a352974"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.814079 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27jk7\" (UniqueName: \"kubernetes.io/projected/6bd144c7-5dac-46d4-8cab-b3a31a352974-kube-api-access-27jk7\") on node \"crc\" DevicePath \"\"" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.814122 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.814135 4725 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:40:09 crc kubenswrapper[4725]: I0227 06:40:09.814144 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd144c7-5dac-46d4-8cab-b3a31a352974-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.203742 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" event={"ID":"6bd144c7-5dac-46d4-8cab-b3a31a352974","Type":"ContainerDied","Data":"0d42e054e416d44f91c6729b6aaf3e0a87a98aab1d9e63724da51c1baf65a1a0"} Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.204098 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d42e054e416d44f91c6729b6aaf3e0a87a98aab1d9e63724da51c1baf65a1a0" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.203825 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.266933 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9"] Feb 27 06:40:10 crc kubenswrapper[4725]: E0227 06:40:10.267307 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b9899d-9ed8-41a3-a1c5-c70459205e2a" containerName="oc" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.267321 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b9899d-9ed8-41a3-a1c5-c70459205e2a" containerName="oc" Feb 27 06:40:10 crc kubenswrapper[4725]: E0227 06:40:10.267353 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd144c7-5dac-46d4-8cab-b3a31a352974" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.267361 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd144c7-5dac-46d4-8cab-b3a31a352974" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.267553 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd144c7-5dac-46d4-8cab-b3a31a352974" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.267582 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b9899d-9ed8-41a3-a1c5-c70459205e2a" containerName="oc" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.268301 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.271437 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.271931 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.272182 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.272412 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.284712 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9"] Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.327459 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.327508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 
06:40:10.327545 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tjq\" (UniqueName: \"kubernetes.io/projected/1fab6c47-9849-428c-96a3-96c4cac71f69-kube-api-access-95tjq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.431718 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.431840 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.431942 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tjq\" (UniqueName: \"kubernetes.io/projected/1fab6c47-9849-428c-96a3-96c4cac71f69-kube-api-access-95tjq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.438209 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.444977 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.453096 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tjq\" (UniqueName: \"kubernetes.io/projected/1fab6c47-9849-428c-96a3-96c4cac71f69-kube-api-access-95tjq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:10 crc kubenswrapper[4725]: I0227 06:40:10.637214 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:40:11 crc kubenswrapper[4725]: I0227 06:40:11.162997 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9"] Feb 27 06:40:11 crc kubenswrapper[4725]: I0227 06:40:11.220094 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" event={"ID":"1fab6c47-9849-428c-96a3-96c4cac71f69","Type":"ContainerStarted","Data":"d6c8c613f29ee197fb3140559af8692372b8da030413be5a05ab37f38237916b"} Feb 27 06:40:12 crc kubenswrapper[4725]: I0227 06:40:12.229871 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" event={"ID":"1fab6c47-9849-428c-96a3-96c4cac71f69","Type":"ContainerStarted","Data":"aa6afdc9aebe31bbe996ab1b1f7205a941baefb9ca7fe41132c0cffc2b681889"} Feb 27 06:40:18 crc kubenswrapper[4725]: I0227 06:40:18.252547 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:40:18 crc kubenswrapper[4725]: E0227 06:40:18.253810 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:40:32 crc kubenswrapper[4725]: I0227 06:40:32.269388 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:40:32 crc kubenswrapper[4725]: E0227 06:40:32.270675 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.052029 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" podStartSLOduration=28.581395976 podStartE2EDuration="29.052004988s" podCreationTimestamp="2026-02-27 06:40:10 +0000 UTC" firstStartedPulling="2026-02-27 06:40:11.16269896 +0000 UTC m=+1789.625319529" lastFinishedPulling="2026-02-27 06:40:11.633307952 +0000 UTC m=+1790.095928541" observedRunningTime="2026-02-27 06:40:12.251054338 +0000 UTC m=+1790.713674917" watchObservedRunningTime="2026-02-27 06:40:39.052004988 +0000 UTC m=+1817.514625567" Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.060804 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-x6dd6"] Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.078325 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e1ec-account-create-update-qv24h"] Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.092560 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ssnn8"] Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.100646 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-x6dd6"] Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.108915 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e1ec-account-create-update-qv24h"] Feb 27 06:40:39 crc kubenswrapper[4725]: I0227 06:40:39.117637 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ssnn8"] Feb 27 06:40:40 crc 
kubenswrapper[4725]: I0227 06:40:40.053587 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xj5xl"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.072033 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d421-account-create-update-4kxss"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.083442 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xj5xl"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.094183 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-s9fpl"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.104104 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-a42a-account-create-update-cc5w5"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.112153 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d421-account-create-update-4kxss"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.120924 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4bcd-account-create-update-fxvct"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.140671 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-s9fpl"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.153003 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-a42a-account-create-update-cc5w5"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.165213 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4bcd-account-create-update-fxvct"] Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.271073 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06bb37bd-657c-48b6-9ed9-7039b6e7211f" path="/var/lib/kubelet/pods/06bb37bd-657c-48b6-9ed9-7039b6e7211f/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.272625 4725 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de12797-1e77-407c-a08f-52ae3855f836" path="/var/lib/kubelet/pods/2de12797-1e77-407c-a08f-52ae3855f836/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.274160 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44334cb8-8e0a-4fb3-976e-b140f4c4f79b" path="/var/lib/kubelet/pods/44334cb8-8e0a-4fb3-976e-b140f4c4f79b/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.275395 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5255cba1-2ca7-460a-b112-28aa45156734" path="/var/lib/kubelet/pods/5255cba1-2ca7-460a-b112-28aa45156734/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.277643 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a31f27e-dadb-461c-a614-77cc108a550f" path="/var/lib/kubelet/pods/5a31f27e-dadb-461c-a614-77cc108a550f/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.278875 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fa3398-4cf5-4247-b31d-f08de7692fa2" path="/var/lib/kubelet/pods/60fa3398-4cf5-4247-b31d-f08de7692fa2/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.280048 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9dd596-cbe0-4c2f-9024-e4724af56387" path="/var/lib/kubelet/pods/7f9dd596-cbe0-4c2f-9024-e4724af56387/volumes" Feb 27 06:40:40 crc kubenswrapper[4725]: I0227 06:40:40.282216 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4381525-a993-4f94-8f82-7ce47ca8e67e" path="/var/lib/kubelet/pods/e4381525-a993-4f94-8f82-7ce47ca8e67e/volumes" Feb 27 06:40:47 crc kubenswrapper[4725]: I0227 06:40:47.252864 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:40:47 crc kubenswrapper[4725]: E0227 06:40:47.254055 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:41:00 crc kubenswrapper[4725]: I0227 06:41:00.073196 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-g78th"] Feb 27 06:41:00 crc kubenswrapper[4725]: I0227 06:41:00.090170 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-g78th"] Feb 27 06:41:00 crc kubenswrapper[4725]: I0227 06:41:00.271615 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac537af-ac73-4c4e-947a-cc2120ccb158" path="/var/lib/kubelet/pods/dac537af-ac73-4c4e-947a-cc2120ccb158/volumes" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.285629 4725 scope.go:117] "RemoveContainer" containerID="52516dca89d290292980c8f2a21d7017c54e8c82475cf671ae9de2fe70912d42" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.314912 4725 scope.go:117] "RemoveContainer" containerID="aa0da31d84c32f8300d73b512a897ddf676a53cbe50b4a12a60296f956535ed3" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.360711 4725 scope.go:117] "RemoveContainer" containerID="26ccfd0f078e62fc6d4642073074b7d790ef772c1cc39bcca3cf9f1e83489707" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.422937 4725 scope.go:117] "RemoveContainer" containerID="dc22c3ac827db7656bbb3e966573510d429daac993b5244e894dd42018760a1e" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.456806 4725 scope.go:117] "RemoveContainer" containerID="d0c135272a567fbf7a4905407edf1c7d047f92a605da1cf0b76eb7cec17e838a" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.512769 4725 scope.go:117] "RemoveContainer" 
containerID="40639ff8b1fe06a901d046c91fbd26ae36baaa256aa83ee36f6663df215bcf9a" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.549689 4725 scope.go:117] "RemoveContainer" containerID="bbf3f84458346ea8d67869d214c8a8e33f58e11e9c444df242e31b064707c2da" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.565697 4725 scope.go:117] "RemoveContainer" containerID="7e07b91b2eb6e5b943fa682856292559ea3dd492925584930d492475ca4fb8ef" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.585376 4725 scope.go:117] "RemoveContainer" containerID="555a6cc9bddc3d9b22d43b96773f9ca7dea7bf645367ed4567bfe4774e49e6ff" Feb 27 06:41:01 crc kubenswrapper[4725]: I0227 06:41:01.625851 4725 scope.go:117] "RemoveContainer" containerID="9ac8cd03bc4170645b1e7db4df6a722aa4ff22d813c9df2e59edb9dd8dc28e01" Feb 27 06:41:02 crc kubenswrapper[4725]: I0227 06:41:02.267597 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:41:02 crc kubenswrapper[4725]: E0227 06:41:02.268162 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:41:05 crc kubenswrapper[4725]: I0227 06:41:05.045657 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6nmms"] Feb 27 06:41:05 crc kubenswrapper[4725]: I0227 06:41:05.065649 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6nmms"] Feb 27 06:41:06 crc kubenswrapper[4725]: I0227 06:41:06.270464 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe3bf48-3f87-4630-b879-65a1614acb41" 
path="/var/lib/kubelet/pods/fbe3bf48-3f87-4630-b879-65a1614acb41/volumes" Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.048493 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e038-account-create-update-7k69g"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.062630 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2q7cz"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.075514 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-063b-account-create-update-tz8vx"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.086544 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-877a-account-create-update-cn8jq"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.095574 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fsxct"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.113140 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e038-account-create-update-7k69g"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.126956 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-063b-account-create-update-tz8vx"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.139911 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-877a-account-create-update-cn8jq"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.149977 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fsxct"] Feb 27 06:41:09 crc kubenswrapper[4725]: I0227 06:41:09.157919 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2q7cz"] Feb 27 06:41:10 crc kubenswrapper[4725]: I0227 06:41:10.269796 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd9edff-02fe-4305-81b9-ee8fbea78f20" 
path="/var/lib/kubelet/pods/1cd9edff-02fe-4305-81b9-ee8fbea78f20/volumes" Feb 27 06:41:10 crc kubenswrapper[4725]: I0227 06:41:10.271203 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78811b9d-d814-4266-911b-9c466a6df5e4" path="/var/lib/kubelet/pods/78811b9d-d814-4266-911b-9c466a6df5e4/volumes" Feb 27 06:41:10 crc kubenswrapper[4725]: I0227 06:41:10.272534 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94647b2a-f9bf-498e-bb33-ca18ee334284" path="/var/lib/kubelet/pods/94647b2a-f9bf-498e-bb33-ca18ee334284/volumes" Feb 27 06:41:10 crc kubenswrapper[4725]: I0227 06:41:10.274446 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3424ee2-48c0-4904-a435-0acde28a6043" path="/var/lib/kubelet/pods/a3424ee2-48c0-4904-a435-0acde28a6043/volumes" Feb 27 06:41:10 crc kubenswrapper[4725]: I0227 06:41:10.276031 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6df5742-cb9e-4731-aa27-0efe73e9e61a" path="/var/lib/kubelet/pods/e6df5742-cb9e-4731-aa27-0efe73e9e61a/volumes" Feb 27 06:41:12 crc kubenswrapper[4725]: I0227 06:41:12.042633 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-98cpk"] Feb 27 06:41:12 crc kubenswrapper[4725]: I0227 06:41:12.064004 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-98cpk"] Feb 27 06:41:12 crc kubenswrapper[4725]: I0227 06:41:12.267142 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6" path="/var/lib/kubelet/pods/dfff8cd0-2933-4c5b-bd1e-a9a4e380a7b6/volumes" Feb 27 06:41:16 crc kubenswrapper[4725]: I0227 06:41:16.251983 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:41:16 crc kubenswrapper[4725]: E0227 06:41:16.252865 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:41:20 crc kubenswrapper[4725]: I0227 06:41:20.045067 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-c7txp"] Feb 27 06:41:20 crc kubenswrapper[4725]: I0227 06:41:20.064744 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-c7txp"] Feb 27 06:41:20 crc kubenswrapper[4725]: I0227 06:41:20.267692 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be03cf3c-5ffa-40cd-9a69-cb386068bc2c" path="/var/lib/kubelet/pods/be03cf3c-5ffa-40cd-9a69-cb386068bc2c/volumes" Feb 27 06:41:21 crc kubenswrapper[4725]: I0227 06:41:21.040233 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vvwlj"] Feb 27 06:41:21 crc kubenswrapper[4725]: I0227 06:41:21.055153 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vvwlj"] Feb 27 06:41:22 crc kubenswrapper[4725]: I0227 06:41:22.264943 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df17b144-75a9-44a8-a2b4-4694687dc01f" path="/var/lib/kubelet/pods/df17b144-75a9-44a8-a2b4-4694687dc01f/volumes" Feb 27 06:41:31 crc kubenswrapper[4725]: I0227 06:41:31.251130 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:41:31 crc kubenswrapper[4725]: E0227 06:41:31.252150 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:41:43 crc kubenswrapper[4725]: I0227 06:41:43.337471 4725 generic.go:334] "Generic (PLEG): container finished" podID="1fab6c47-9849-428c-96a3-96c4cac71f69" containerID="aa6afdc9aebe31bbe996ab1b1f7205a941baefb9ca7fe41132c0cffc2b681889" exitCode=0 Feb 27 06:41:43 crc kubenswrapper[4725]: I0227 06:41:43.337579 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" event={"ID":"1fab6c47-9849-428c-96a3-96c4cac71f69","Type":"ContainerDied","Data":"aa6afdc9aebe31bbe996ab1b1f7205a941baefb9ca7fe41132c0cffc2b681889"} Feb 27 06:41:44 crc kubenswrapper[4725]: I0227 06:41:44.837707 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:41:44 crc kubenswrapper[4725]: I0227 06:41:44.961307 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95tjq\" (UniqueName: \"kubernetes.io/projected/1fab6c47-9849-428c-96a3-96c4cac71f69-kube-api-access-95tjq\") pod \"1fab6c47-9849-428c-96a3-96c4cac71f69\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " Feb 27 06:41:44 crc kubenswrapper[4725]: I0227 06:41:44.961533 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-inventory\") pod \"1fab6c47-9849-428c-96a3-96c4cac71f69\" (UID: \"1fab6c47-9849-428c-96a3-96c4cac71f69\") " Feb 27 06:41:44 crc kubenswrapper[4725]: I0227 06:41:44.961744 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-ssh-key-openstack-edpm-ipam\") pod \"1fab6c47-9849-428c-96a3-96c4cac71f69\" (UID: 
\"1fab6c47-9849-428c-96a3-96c4cac71f69\") " Feb 27 06:41:44 crc kubenswrapper[4725]: I0227 06:41:44.971557 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fab6c47-9849-428c-96a3-96c4cac71f69-kube-api-access-95tjq" (OuterVolumeSpecName: "kube-api-access-95tjq") pod "1fab6c47-9849-428c-96a3-96c4cac71f69" (UID: "1fab6c47-9849-428c-96a3-96c4cac71f69"). InnerVolumeSpecName "kube-api-access-95tjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.004882 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-inventory" (OuterVolumeSpecName: "inventory") pod "1fab6c47-9849-428c-96a3-96c4cac71f69" (UID: "1fab6c47-9849-428c-96a3-96c4cac71f69"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.011801 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1fab6c47-9849-428c-96a3-96c4cac71f69" (UID: "1fab6c47-9849-428c-96a3-96c4cac71f69"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.064726 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95tjq\" (UniqueName: \"kubernetes.io/projected/1fab6c47-9849-428c-96a3-96c4cac71f69-kube-api-access-95tjq\") on node \"crc\" DevicePath \"\"" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.064832 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.064856 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1fab6c47-9849-428c-96a3-96c4cac71f69-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.362159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" event={"ID":"1fab6c47-9849-428c-96a3-96c4cac71f69","Type":"ContainerDied","Data":"d6c8c613f29ee197fb3140559af8692372b8da030413be5a05ab37f38237916b"} Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.362507 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c8c613f29ee197fb3140559af8692372b8da030413be5a05ab37f38237916b" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.362386 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.487858 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v"] Feb 27 06:41:45 crc kubenswrapper[4725]: E0227 06:41:45.488330 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fab6c47-9849-428c-96a3-96c4cac71f69" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.488352 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fab6c47-9849-428c-96a3-96c4cac71f69" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.488589 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fab6c47-9849-428c-96a3-96c4cac71f69" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.489273 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.492620 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.492675 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.493243 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.493705 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.509555 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v"] Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.678151 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.678248 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc 
kubenswrapper[4725]: I0227 06:41:45.678278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm29v\" (UniqueName: \"kubernetes.io/projected/311ba5d5-8172-405b-aead-458a7149e826-kube-api-access-dm29v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.779861 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.779960 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.779991 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm29v\" (UniqueName: \"kubernetes.io/projected/311ba5d5-8172-405b-aead-458a7149e826-kube-api-access-dm29v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.784430 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.786232 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.800052 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm29v\" (UniqueName: \"kubernetes.io/projected/311ba5d5-8172-405b-aead-458a7149e826-kube-api-access-dm29v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pj85v\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:45 crc kubenswrapper[4725]: I0227 06:41:45.809810 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:41:46 crc kubenswrapper[4725]: I0227 06:41:46.157779 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v"] Feb 27 06:41:46 crc kubenswrapper[4725]: I0227 06:41:46.163903 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:41:46 crc kubenswrapper[4725]: I0227 06:41:46.251966 4725 scope.go:117] "RemoveContainer" containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:41:46 crc kubenswrapper[4725]: I0227 06:41:46.375317 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" event={"ID":"311ba5d5-8172-405b-aead-458a7149e826","Type":"ContainerStarted","Data":"ad106a46acf000a28b9c983e5f0df86fe38cbb0cef87ba9b76320210c2554a9a"} Feb 27 06:41:47 crc kubenswrapper[4725]: I0227 06:41:47.385824 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" event={"ID":"311ba5d5-8172-405b-aead-458a7149e826","Type":"ContainerStarted","Data":"582a023f5292e1d89d4ea6f564e320c6f7cef5ae34d3f6214fc3df7baf4078ea"} Feb 27 06:41:47 crc kubenswrapper[4725]: I0227 06:41:47.389269 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"0b501af946b1ad795a1755f721e8e4751f676f233bfcb14cf18cba37c4f4ffaf"} Feb 27 06:41:47 crc kubenswrapper[4725]: I0227 06:41:47.413173 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" podStartSLOduration=1.564673565 podStartE2EDuration="2.413155505s" podCreationTimestamp="2026-02-27 06:41:45 +0000 
UTC" firstStartedPulling="2026-02-27 06:41:46.163614693 +0000 UTC m=+1884.626235272" lastFinishedPulling="2026-02-27 06:41:47.012096603 +0000 UTC m=+1885.474717212" observedRunningTime="2026-02-27 06:41:47.407389731 +0000 UTC m=+1885.870010330" watchObservedRunningTime="2026-02-27 06:41:47.413155505 +0000 UTC m=+1885.875776084" Feb 27 06:41:55 crc kubenswrapper[4725]: I0227 06:41:55.057448 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bvhmj"] Feb 27 06:41:55 crc kubenswrapper[4725]: I0227 06:41:55.070997 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bvhmj"] Feb 27 06:41:56 crc kubenswrapper[4725]: I0227 06:41:56.268427 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab103cbe-a833-4f47-8101-d9ea92afe59c" path="/var/lib/kubelet/pods/ab103cbe-a833-4f47-8101-d9ea92afe59c/volumes" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.152913 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536242-pjw6w"] Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.155056 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.157906 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.159243 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.161467 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.174921 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536242-pjw6w"] Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.345309 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62rqn\" (UniqueName: \"kubernetes.io/projected/e2fedfdb-d283-478e-bc9a-7f225d1a6d63-kube-api-access-62rqn\") pod \"auto-csr-approver-29536242-pjw6w\" (UID: \"e2fedfdb-d283-478e-bc9a-7f225d1a6d63\") " pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.447650 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62rqn\" (UniqueName: \"kubernetes.io/projected/e2fedfdb-d283-478e-bc9a-7f225d1a6d63-kube-api-access-62rqn\") pod \"auto-csr-approver-29536242-pjw6w\" (UID: \"e2fedfdb-d283-478e-bc9a-7f225d1a6d63\") " pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.474955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62rqn\" (UniqueName: \"kubernetes.io/projected/e2fedfdb-d283-478e-bc9a-7f225d1a6d63-kube-api-access-62rqn\") pod \"auto-csr-approver-29536242-pjw6w\" (UID: \"e2fedfdb-d283-478e-bc9a-7f225d1a6d63\") " 
pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.509206 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:00 crc kubenswrapper[4725]: I0227 06:42:00.981769 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536242-pjw6w"] Feb 27 06:42:01 crc kubenswrapper[4725]: I0227 06:42:01.543300 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" event={"ID":"e2fedfdb-d283-478e-bc9a-7f225d1a6d63","Type":"ContainerStarted","Data":"fca8bb973376ce5612ee5b1ff8812b304b0c97ada0b315d8e4d91f8a0cccb73a"} Feb 27 06:42:01 crc kubenswrapper[4725]: I0227 06:42:01.803922 4725 scope.go:117] "RemoveContainer" containerID="b58ca8ecb6430f2c11c2126e975e12898b529c6d118421061806aab48926540d" Feb 27 06:42:01 crc kubenswrapper[4725]: I0227 06:42:01.857755 4725 scope.go:117] "RemoveContainer" containerID="ecf5e7d116b70a689552aedc8937b97ef37d881dc82aa7e9d0b29abbc1ac8ac8" Feb 27 06:42:01 crc kubenswrapper[4725]: I0227 06:42:01.908172 4725 scope.go:117] "RemoveContainer" containerID="fa80896acee4d9c5698973586823ba4b9b7144dc91d808d9a17bad162b28c1d0" Feb 27 06:42:01 crc kubenswrapper[4725]: I0227 06:42:01.951729 4725 scope.go:117] "RemoveContainer" containerID="0a5141fe3ac64466dc1c51f41f414fc9e051846f3141d391347f54f79c61d3ea" Feb 27 06:42:01 crc kubenswrapper[4725]: I0227 06:42:01.984963 4725 scope.go:117] "RemoveContainer" containerID="8d28ad203b2aeaee1320d261fb493db88dc11595d7154df0162195fba412ad0c" Feb 27 06:42:02 crc kubenswrapper[4725]: I0227 06:42:02.049889 4725 scope.go:117] "RemoveContainer" containerID="5aa334da6e28dba02b940326dbb9e9753d6bb49e62dd188b73f4ab0f088d367e" Feb 27 06:42:02 crc kubenswrapper[4725]: I0227 06:42:02.082992 4725 scope.go:117] "RemoveContainer" 
containerID="47bbf6f0345636425feb69b0475a99faa69fcbb5d7cc1c7b03e5806a70973cab" Feb 27 06:42:02 crc kubenswrapper[4725]: I0227 06:42:02.112108 4725 scope.go:117] "RemoveContainer" containerID="ba300aa808522a64bffe439dff7f8181467f6abc326bc4fa504121289b71f496" Feb 27 06:42:02 crc kubenswrapper[4725]: I0227 06:42:02.150466 4725 scope.go:117] "RemoveContainer" containerID="9f1a737ef7eb69e0b3c45c4647b1d010e348982a9badebd73a266343fc8663c9" Feb 27 06:42:02 crc kubenswrapper[4725]: I0227 06:42:02.173134 4725 scope.go:117] "RemoveContainer" containerID="0eb7dcd26fc725bba4ccc650386b5e042f0ad7517425afbe2b83666767456303" Feb 27 06:42:03 crc kubenswrapper[4725]: I0227 06:42:03.055418 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fxbvl"] Feb 27 06:42:03 crc kubenswrapper[4725]: I0227 06:42:03.070723 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fxbvl"] Feb 27 06:42:03 crc kubenswrapper[4725]: I0227 06:42:03.571423 4725 generic.go:334] "Generic (PLEG): container finished" podID="e2fedfdb-d283-478e-bc9a-7f225d1a6d63" containerID="65b91a8b96cb888344c913d6927f7a13e950b1645d22a6d9c54d96ea2bcfe527" exitCode=0 Feb 27 06:42:03 crc kubenswrapper[4725]: I0227 06:42:03.571526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" event={"ID":"e2fedfdb-d283-478e-bc9a-7f225d1a6d63","Type":"ContainerDied","Data":"65b91a8b96cb888344c913d6927f7a13e950b1645d22a6d9c54d96ea2bcfe527"} Feb 27 06:42:04 crc kubenswrapper[4725]: I0227 06:42:04.271654 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b95598a-2902-4372-b9f4-a40152f1c45f" path="/var/lib/kubelet/pods/4b95598a-2902-4372-b9f4-a40152f1c45f/volumes" Feb 27 06:42:04 crc kubenswrapper[4725]: I0227 06:42:04.971159 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:05 crc kubenswrapper[4725]: I0227 06:42:05.141881 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62rqn\" (UniqueName: \"kubernetes.io/projected/e2fedfdb-d283-478e-bc9a-7f225d1a6d63-kube-api-access-62rqn\") pod \"e2fedfdb-d283-478e-bc9a-7f225d1a6d63\" (UID: \"e2fedfdb-d283-478e-bc9a-7f225d1a6d63\") " Feb 27 06:42:05 crc kubenswrapper[4725]: I0227 06:42:05.155692 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fedfdb-d283-478e-bc9a-7f225d1a6d63-kube-api-access-62rqn" (OuterVolumeSpecName: "kube-api-access-62rqn") pod "e2fedfdb-d283-478e-bc9a-7f225d1a6d63" (UID: "e2fedfdb-d283-478e-bc9a-7f225d1a6d63"). InnerVolumeSpecName "kube-api-access-62rqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:42:05 crc kubenswrapper[4725]: I0227 06:42:05.245693 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62rqn\" (UniqueName: \"kubernetes.io/projected/e2fedfdb-d283-478e-bc9a-7f225d1a6d63-kube-api-access-62rqn\") on node \"crc\" DevicePath \"\"" Feb 27 06:42:05 crc kubenswrapper[4725]: I0227 06:42:05.600035 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" event={"ID":"e2fedfdb-d283-478e-bc9a-7f225d1a6d63","Type":"ContainerDied","Data":"fca8bb973376ce5612ee5b1ff8812b304b0c97ada0b315d8e4d91f8a0cccb73a"} Feb 27 06:42:05 crc kubenswrapper[4725]: I0227 06:42:05.600326 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fca8bb973376ce5612ee5b1ff8812b304b0c97ada0b315d8e4d91f8a0cccb73a" Feb 27 06:42:05 crc kubenswrapper[4725]: I0227 06:42:05.600140 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536242-pjw6w" Feb 27 06:42:06 crc kubenswrapper[4725]: I0227 06:42:06.042195 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536236-6c6mp"] Feb 27 06:42:06 crc kubenswrapper[4725]: I0227 06:42:06.056461 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536236-6c6mp"] Feb 27 06:42:06 crc kubenswrapper[4725]: I0227 06:42:06.264389 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eef3217-3389-48e0-8aa1-e6017e20258d" path="/var/lib/kubelet/pods/5eef3217-3389-48e0-8aa1-e6017e20258d/volumes" Feb 27 06:42:08 crc kubenswrapper[4725]: I0227 06:42:08.050963 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c7fnv"] Feb 27 06:42:08 crc kubenswrapper[4725]: I0227 06:42:08.059119 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c7fnv"] Feb 27 06:42:08 crc kubenswrapper[4725]: I0227 06:42:08.268807 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab5820b-1151-4cc9-ae7b-09b596335d88" path="/var/lib/kubelet/pods/9ab5820b-1151-4cc9-ae7b-09b596335d88/volumes" Feb 27 06:42:19 crc kubenswrapper[4725]: I0227 06:42:19.027995 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-zb7cc"] Feb 27 06:42:19 crc kubenswrapper[4725]: I0227 06:42:19.042976 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zb7cc"] Feb 27 06:42:20 crc kubenswrapper[4725]: I0227 06:42:20.269101 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617c62fd-dee8-4bab-a69a-8f348c8487a3" path="/var/lib/kubelet/pods/617c62fd-dee8-4bab-a69a-8f348c8487a3/volumes" Feb 27 06:42:21 crc kubenswrapper[4725]: I0227 06:42:21.030671 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-5v8pq"] Feb 27 06:42:21 crc 
kubenswrapper[4725]: I0227 06:42:21.046555 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-5v8pq"] Feb 27 06:42:22 crc kubenswrapper[4725]: I0227 06:42:22.268024 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a306be2-547c-404f-afc1-4f4639cf7a28" path="/var/lib/kubelet/pods/6a306be2-547c-404f-afc1-4f4639cf7a28/volumes" Feb 27 06:42:55 crc kubenswrapper[4725]: I0227 06:42:55.146453 4725 generic.go:334] "Generic (PLEG): container finished" podID="311ba5d5-8172-405b-aead-458a7149e826" containerID="582a023f5292e1d89d4ea6f564e320c6f7cef5ae34d3f6214fc3df7baf4078ea" exitCode=0 Feb 27 06:42:55 crc kubenswrapper[4725]: I0227 06:42:55.146606 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" event={"ID":"311ba5d5-8172-405b-aead-458a7149e826","Type":"ContainerDied","Data":"582a023f5292e1d89d4ea6f564e320c6f7cef5ae34d3f6214fc3df7baf4078ea"} Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.658693 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.848381 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-inventory\") pod \"311ba5d5-8172-405b-aead-458a7149e826\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.848467 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm29v\" (UniqueName: \"kubernetes.io/projected/311ba5d5-8172-405b-aead-458a7149e826-kube-api-access-dm29v\") pod \"311ba5d5-8172-405b-aead-458a7149e826\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.848630 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-ssh-key-openstack-edpm-ipam\") pod \"311ba5d5-8172-405b-aead-458a7149e826\" (UID: \"311ba5d5-8172-405b-aead-458a7149e826\") " Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.857119 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311ba5d5-8172-405b-aead-458a7149e826-kube-api-access-dm29v" (OuterVolumeSpecName: "kube-api-access-dm29v") pod "311ba5d5-8172-405b-aead-458a7149e826" (UID: "311ba5d5-8172-405b-aead-458a7149e826"). InnerVolumeSpecName "kube-api-access-dm29v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.884329 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "311ba5d5-8172-405b-aead-458a7149e826" (UID: "311ba5d5-8172-405b-aead-458a7149e826"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.892143 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-inventory" (OuterVolumeSpecName: "inventory") pod "311ba5d5-8172-405b-aead-458a7149e826" (UID: "311ba5d5-8172-405b-aead-458a7149e826"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.952803 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.952910 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm29v\" (UniqueName: \"kubernetes.io/projected/311ba5d5-8172-405b-aead-458a7149e826-kube-api-access-dm29v\") on node \"crc\" DevicePath \"\"" Feb 27 06:42:56 crc kubenswrapper[4725]: I0227 06:42:56.952989 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/311ba5d5-8172-405b-aead-458a7149e826-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.184660 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" 
event={"ID":"311ba5d5-8172-405b-aead-458a7149e826","Type":"ContainerDied","Data":"ad106a46acf000a28b9c983e5f0df86fe38cbb0cef87ba9b76320210c2554a9a"} Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.184714 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad106a46acf000a28b9c983e5f0df86fe38cbb0cef87ba9b76320210c2554a9a" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.184819 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pj85v" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.284527 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p"] Feb 27 06:42:57 crc kubenswrapper[4725]: E0227 06:42:57.285002 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311ba5d5-8172-405b-aead-458a7149e826" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.285018 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="311ba5d5-8172-405b-aead-458a7149e826" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 27 06:42:57 crc kubenswrapper[4725]: E0227 06:42:57.285052 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fedfdb-d283-478e-bc9a-7f225d1a6d63" containerName="oc" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.285060 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fedfdb-d283-478e-bc9a-7f225d1a6d63" containerName="oc" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.285314 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fedfdb-d283-478e-bc9a-7f225d1a6d63" containerName="oc" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.285330 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="311ba5d5-8172-405b-aead-458a7149e826" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.286241 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.290559 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.291069 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.291402 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.291672 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.295161 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p"] Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.465613 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw678\" (UniqueName: \"kubernetes.io/projected/308aa3d5-1a73-49da-98ae-a723be6a9c31-kube-api-access-cw678\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.465684 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.465717 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.567727 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw678\" (UniqueName: \"kubernetes.io/projected/308aa3d5-1a73-49da-98ae-a723be6a9c31-kube-api-access-cw678\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.567887 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.567946 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.572893 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.577909 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.612014 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw678\" (UniqueName: \"kubernetes.io/projected/308aa3d5-1a73-49da-98ae-a723be6a9c31-kube-api-access-cw678\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-85q2p\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:57 crc kubenswrapper[4725]: I0227 06:42:57.906103 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:42:58 crc kubenswrapper[4725]: I0227 06:42:58.520546 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p"] Feb 27 06:42:59 crc kubenswrapper[4725]: I0227 06:42:59.203043 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" event={"ID":"308aa3d5-1a73-49da-98ae-a723be6a9c31","Type":"ContainerStarted","Data":"4b3905908c65f0710b6ee57b1499cdcdde4c910ebc775b54b453cc0ced969e23"} Feb 27 06:43:00 crc kubenswrapper[4725]: I0227 06:43:00.215946 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" event={"ID":"308aa3d5-1a73-49da-98ae-a723be6a9c31","Type":"ContainerStarted","Data":"83b366a1ae0faace58840c429385396a9bc7311e8d9305d2bbbf8b9bc9ece783"} Feb 27 06:43:00 crc kubenswrapper[4725]: I0227 06:43:00.244723 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" podStartSLOduration=2.749687251 podStartE2EDuration="3.244693484s" podCreationTimestamp="2026-02-27 06:42:57 +0000 UTC" firstStartedPulling="2026-02-27 06:42:58.524668465 +0000 UTC m=+1956.987289034" lastFinishedPulling="2026-02-27 06:42:59.019674658 +0000 UTC m=+1957.482295267" observedRunningTime="2026-02-27 06:43:00.234818223 +0000 UTC m=+1958.697438802" watchObservedRunningTime="2026-02-27 06:43:00.244693484 +0000 UTC m=+1958.707314063" Feb 27 06:43:02 crc kubenswrapper[4725]: I0227 06:43:02.409968 4725 scope.go:117] "RemoveContainer" containerID="cb419507ea31548cd9839a0593ffb565ab0a0f14c6be351d8555d15864a47637" Feb 27 06:43:02 crc kubenswrapper[4725]: I0227 06:43:02.452419 4725 scope.go:117] "RemoveContainer" containerID="8794d1147b6b8db32bd737d441e4fc901912bda9cdc0753005c03ee0e1534914" 
Feb 27 06:43:02 crc kubenswrapper[4725]: I0227 06:43:02.493124 4725 scope.go:117] "RemoveContainer" containerID="84f332f8690913a03ca370eeaa881c97db72381090f14bef26d6fa16a82344ff" Feb 27 06:43:02 crc kubenswrapper[4725]: I0227 06:43:02.567695 4725 scope.go:117] "RemoveContainer" containerID="c98c707a97476f9732720cd0fa8f34bbf5dff542eff9b9b66ded3a073e396187" Feb 27 06:43:02 crc kubenswrapper[4725]: I0227 06:43:02.633321 4725 scope.go:117] "RemoveContainer" containerID="4f405d997ffaff8df8f72728e5663cade2aff7b899d3f9ed93a8cb96bbd91b7d" Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.056736 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-46fc-account-create-update-sp56g"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.064548 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-m6gzd"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.072584 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mz255"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.080916 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tzhf5"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.088717 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fd8c-account-create-update-tn48s"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.097516 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3cc6-account-create-update-8fxff"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.107269 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-46fc-account-create-update-sp56g"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.114506 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-m6gzd"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.122192 4725 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell0-fd8c-account-create-update-tn48s"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.130052 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mz255"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.137840 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3cc6-account-create-update-8fxff"] Feb 27 06:43:03 crc kubenswrapper[4725]: I0227 06:43:03.145602 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tzhf5"] Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.257626 4725 generic.go:334] "Generic (PLEG): container finished" podID="308aa3d5-1a73-49da-98ae-a723be6a9c31" containerID="83b366a1ae0faace58840c429385396a9bc7311e8d9305d2bbbf8b9bc9ece783" exitCode=0 Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.266640 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2e80d4-9806-45a7-b10e-91d387331e54" path="/var/lib/kubelet/pods/6c2e80d4-9806-45a7-b10e-91d387331e54/volumes" Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.267197 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87486239-3017-44ed-ba9d-a28541bb2aca" path="/var/lib/kubelet/pods/87486239-3017-44ed-ba9d-a28541bb2aca/volumes" Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.267799 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903b7538-3fc0-4580-9bd0-adff6ce3f634" path="/var/lib/kubelet/pods/903b7538-3fc0-4580-9bd0-adff6ce3f634/volumes" Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.268970 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af23271e-c19f-475c-8ff7-51e9bbe4471e" path="/var/lib/kubelet/pods/af23271e-c19f-475c-8ff7-51e9bbe4471e/volumes" Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.269933 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c28e6b4f-6ef4-4e8f-9b40-366064eec781" path="/var/lib/kubelet/pods/c28e6b4f-6ef4-4e8f-9b40-366064eec781/volumes" Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.270446 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe9b6e7-e6f5-4255-ab87-5c42cc89963a" path="/var/lib/kubelet/pods/dfe9b6e7-e6f5-4255-ab87-5c42cc89963a/volumes" Feb 27 06:43:04 crc kubenswrapper[4725]: I0227 06:43:04.270912 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" event={"ID":"308aa3d5-1a73-49da-98ae-a723be6a9c31","Type":"ContainerDied","Data":"83b366a1ae0faace58840c429385396a9bc7311e8d9305d2bbbf8b9bc9ece783"} Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.775564 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.954876 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-ssh-key-openstack-edpm-ipam\") pod \"308aa3d5-1a73-49da-98ae-a723be6a9c31\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.955147 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw678\" (UniqueName: \"kubernetes.io/projected/308aa3d5-1a73-49da-98ae-a723be6a9c31-kube-api-access-cw678\") pod \"308aa3d5-1a73-49da-98ae-a723be6a9c31\" (UID: \"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.955357 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-inventory\") pod \"308aa3d5-1a73-49da-98ae-a723be6a9c31\" (UID: 
\"308aa3d5-1a73-49da-98ae-a723be6a9c31\") " Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.960851 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308aa3d5-1a73-49da-98ae-a723be6a9c31-kube-api-access-cw678" (OuterVolumeSpecName: "kube-api-access-cw678") pod "308aa3d5-1a73-49da-98ae-a723be6a9c31" (UID: "308aa3d5-1a73-49da-98ae-a723be6a9c31"). InnerVolumeSpecName "kube-api-access-cw678". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.988091 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-inventory" (OuterVolumeSpecName: "inventory") pod "308aa3d5-1a73-49da-98ae-a723be6a9c31" (UID: "308aa3d5-1a73-49da-98ae-a723be6a9c31"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:43:05 crc kubenswrapper[4725]: I0227 06:43:05.991455 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "308aa3d5-1a73-49da-98ae-a723be6a9c31" (UID: "308aa3d5-1a73-49da-98ae-a723be6a9c31"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.058390 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw678\" (UniqueName: \"kubernetes.io/projected/308aa3d5-1a73-49da-98ae-a723be6a9c31-kube-api-access-cw678\") on node \"crc\" DevicePath \"\"" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.058445 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.058462 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/308aa3d5-1a73-49da-98ae-a723be6a9c31-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.280500 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" event={"ID":"308aa3d5-1a73-49da-98ae-a723be6a9c31","Type":"ContainerDied","Data":"4b3905908c65f0710b6ee57b1499cdcdde4c910ebc775b54b453cc0ced969e23"} Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.280545 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b3905908c65f0710b6ee57b1499cdcdde4c910ebc775b54b453cc0ced969e23" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.280587 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-85q2p" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.389109 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r"] Feb 27 06:43:06 crc kubenswrapper[4725]: E0227 06:43:06.389644 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308aa3d5-1a73-49da-98ae-a723be6a9c31" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.389670 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="308aa3d5-1a73-49da-98ae-a723be6a9c31" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.389960 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="308aa3d5-1a73-49da-98ae-a723be6a9c31" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.390825 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.392983 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.393142 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.396822 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.397084 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.419454 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r"] Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.570435 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st4zw\" (UniqueName: \"kubernetes.io/projected/bf027694-e689-4cb8-aaf6-3e848ec2de4b-kube-api-access-st4zw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.570525 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 
06:43:06.570559 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.672727 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.672775 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.672943 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st4zw\" (UniqueName: \"kubernetes.io/projected/bf027694-e689-4cb8-aaf6-3e848ec2de4b-kube-api-access-st4zw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.680460 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.684912 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.697221 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st4zw\" (UniqueName: \"kubernetes.io/projected/bf027694-e689-4cb8-aaf6-3e848ec2de4b-kube-api-access-st4zw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-64s2r\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:06 crc kubenswrapper[4725]: I0227 06:43:06.714162 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:07 crc kubenswrapper[4725]: I0227 06:43:07.244355 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r"] Feb 27 06:43:07 crc kubenswrapper[4725]: I0227 06:43:07.292006 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" event={"ID":"bf027694-e689-4cb8-aaf6-3e848ec2de4b","Type":"ContainerStarted","Data":"6d2108900061a436020b1f1321f289871895a69e24685519b3282dc5a28f9b85"} Feb 27 06:43:08 crc kubenswrapper[4725]: I0227 06:43:08.301977 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" event={"ID":"bf027694-e689-4cb8-aaf6-3e848ec2de4b","Type":"ContainerStarted","Data":"59169d27c7123ae218e56ec242dd353bd31aff705236bf47964a6bbe4a98da66"} Feb 27 06:43:08 crc kubenswrapper[4725]: I0227 06:43:08.318197 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" podStartSLOduration=1.898013948 podStartE2EDuration="2.318179503s" podCreationTimestamp="2026-02-27 06:43:06 +0000 UTC" firstStartedPulling="2026-02-27 06:43:07.257783787 +0000 UTC m=+1965.720404366" lastFinishedPulling="2026-02-27 06:43:07.677949352 +0000 UTC m=+1966.140569921" observedRunningTime="2026-02-27 06:43:08.315832906 +0000 UTC m=+1966.778453495" watchObservedRunningTime="2026-02-27 06:43:08.318179503 +0000 UTC m=+1966.780800072" Feb 27 06:43:39 crc kubenswrapper[4725]: I0227 06:43:39.050500 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kjl9w"] Feb 27 06:43:39 crc kubenswrapper[4725]: I0227 06:43:39.060894 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kjl9w"] Feb 27 06:43:40 crc kubenswrapper[4725]: I0227 
06:43:40.265452 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010" path="/var/lib/kubelet/pods/56c5dc31-7f88-4c7a-9f3a-e8dbfe4b7010/volumes" Feb 27 06:43:46 crc kubenswrapper[4725]: I0227 06:43:46.729283 4725 generic.go:334] "Generic (PLEG): container finished" podID="bf027694-e689-4cb8-aaf6-3e848ec2de4b" containerID="59169d27c7123ae218e56ec242dd353bd31aff705236bf47964a6bbe4a98da66" exitCode=0 Feb 27 06:43:46 crc kubenswrapper[4725]: I0227 06:43:46.729361 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" event={"ID":"bf027694-e689-4cb8-aaf6-3e848ec2de4b","Type":"ContainerDied","Data":"59169d27c7123ae218e56ec242dd353bd31aff705236bf47964a6bbe4a98da66"} Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.183482 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.339533 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st4zw\" (UniqueName: \"kubernetes.io/projected/bf027694-e689-4cb8-aaf6-3e848ec2de4b-kube-api-access-st4zw\") pod \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.339708 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-inventory\") pod \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.340081 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-ssh-key-openstack-edpm-ipam\") 
pod \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\" (UID: \"bf027694-e689-4cb8-aaf6-3e848ec2de4b\") " Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.348508 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf027694-e689-4cb8-aaf6-3e848ec2de4b-kube-api-access-st4zw" (OuterVolumeSpecName: "kube-api-access-st4zw") pod "bf027694-e689-4cb8-aaf6-3e848ec2de4b" (UID: "bf027694-e689-4cb8-aaf6-3e848ec2de4b"). InnerVolumeSpecName "kube-api-access-st4zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.380544 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-inventory" (OuterVolumeSpecName: "inventory") pod "bf027694-e689-4cb8-aaf6-3e848ec2de4b" (UID: "bf027694-e689-4cb8-aaf6-3e848ec2de4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.393086 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf027694-e689-4cb8-aaf6-3e848ec2de4b" (UID: "bf027694-e689-4cb8-aaf6-3e848ec2de4b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.443313 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.443366 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st4zw\" (UniqueName: \"kubernetes.io/projected/bf027694-e689-4cb8-aaf6-3e848ec2de4b-kube-api-access-st4zw\") on node \"crc\" DevicePath \"\"" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.443386 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf027694-e689-4cb8-aaf6-3e848ec2de4b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.759092 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" event={"ID":"bf027694-e689-4cb8-aaf6-3e848ec2de4b","Type":"ContainerDied","Data":"6d2108900061a436020b1f1321f289871895a69e24685519b3282dc5a28f9b85"} Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.759236 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d2108900061a436020b1f1321f289871895a69e24685519b3282dc5a28f9b85" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.759348 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-64s2r" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.874885 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn"] Feb 27 06:43:48 crc kubenswrapper[4725]: E0227 06:43:48.875980 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf027694-e689-4cb8-aaf6-3e848ec2de4b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.876020 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf027694-e689-4cb8-aaf6-3e848ec2de4b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.884258 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf027694-e689-4cb8-aaf6-3e848ec2de4b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.885458 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.889307 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.889513 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.889727 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.893828 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.905401 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn"] Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.955499 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.955703 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmst\" (UniqueName: \"kubernetes.io/projected/8b00cf98-bb69-4c5e-8f34-e862f1acf329-kube-api-access-zwmst\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:48 crc kubenswrapper[4725]: I0227 06:43:48.955805 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.058375 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.058508 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.058584 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmst\" (UniqueName: \"kubernetes.io/projected/8b00cf98-bb69-4c5e-8f34-e862f1acf329-kube-api-access-zwmst\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.068971 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.081930 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.093274 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmst\" (UniqueName: \"kubernetes.io/projected/8b00cf98-bb69-4c5e-8f34-e862f1acf329-kube-api-access-zwmst\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.211079 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.743966 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn"] Feb 27 06:43:49 crc kubenswrapper[4725]: I0227 06:43:49.768526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" event={"ID":"8b00cf98-bb69-4c5e-8f34-e862f1acf329","Type":"ContainerStarted","Data":"db87f18cfdc6baccbcc8ccf625475b3246456d5460c40190d558ad5dc16bda68"} Feb 27 06:43:50 crc kubenswrapper[4725]: I0227 06:43:50.779762 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" event={"ID":"8b00cf98-bb69-4c5e-8f34-e862f1acf329","Type":"ContainerStarted","Data":"17961e0a4acd69cb32158cc39c1fd6b46c9ddbd04c2e4de239f492cdaae9660d"} Feb 27 06:43:50 crc kubenswrapper[4725]: I0227 06:43:50.802547 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" podStartSLOduration=2.195480329 podStartE2EDuration="2.802530157s" podCreationTimestamp="2026-02-27 06:43:48 +0000 UTC" firstStartedPulling="2026-02-27 06:43:49.751711123 +0000 UTC m=+2008.214331712" lastFinishedPulling="2026-02-27 06:43:50.358760961 +0000 UTC m=+2008.821381540" observedRunningTime="2026-02-27 06:43:50.800566831 +0000 UTC m=+2009.263187400" watchObservedRunningTime="2026-02-27 06:43:50.802530157 +0000 UTC m=+2009.265150726" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.132496 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536244-9cfm7"] Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.134232 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.137991 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.138614 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.138932 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.163542 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536244-9cfm7"] Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.220315 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnnp\" (UniqueName: \"kubernetes.io/projected/e5de01ce-d531-45f2-bf05-f66ada293780-kube-api-access-8cnnp\") pod \"auto-csr-approver-29536244-9cfm7\" (UID: \"e5de01ce-d531-45f2-bf05-f66ada293780\") " pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.322654 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnnp\" (UniqueName: \"kubernetes.io/projected/e5de01ce-d531-45f2-bf05-f66ada293780-kube-api-access-8cnnp\") pod \"auto-csr-approver-29536244-9cfm7\" (UID: \"e5de01ce-d531-45f2-bf05-f66ada293780\") " pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.349448 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnnp\" (UniqueName: \"kubernetes.io/projected/e5de01ce-d531-45f2-bf05-f66ada293780-kube-api-access-8cnnp\") pod \"auto-csr-approver-29536244-9cfm7\" (UID: \"e5de01ce-d531-45f2-bf05-f66ada293780\") " 
pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.452928 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:00 crc kubenswrapper[4725]: I0227 06:44:00.919758 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536244-9cfm7"] Feb 27 06:44:01 crc kubenswrapper[4725]: I0227 06:44:01.920509 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" event={"ID":"e5de01ce-d531-45f2-bf05-f66ada293780","Type":"ContainerStarted","Data":"c425a9aa705bd54009570aedff6f9e4b18c733e1b0422ab42faad1771e8b2d86"} Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.553994 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.554392 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.763965 4725 scope.go:117] "RemoveContainer" containerID="f3190fa7ccaffee1ae3d41ca52006b2c0f6abfb2097c73f9d59785e19ceb8492" Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.796912 4725 scope.go:117] "RemoveContainer" containerID="5bf67bb66aa34941d965e0f91231a40281eeb4649cb3a35ac6075aa18783d165" Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.851875 4725 scope.go:117] "RemoveContainer" 
containerID="a99d01d7467652bb4c93ac11f0301ed5b38259cc6f040509661a9a6e0d7efcaf" Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.885034 4725 scope.go:117] "RemoveContainer" containerID="4290a0143f88e4f6b0dbfe4ef20b3bb7f79205484d3e75cf06918865c91a35cd" Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.939748 4725 scope.go:117] "RemoveContainer" containerID="50dac54b18dbafeee33efb59de4343536383f956d1774fe4d0956c2c621b9884" Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.942344 4725 generic.go:334] "Generic (PLEG): container finished" podID="e5de01ce-d531-45f2-bf05-f66ada293780" containerID="874a410fe2e7ab449b90256e3396e21a9592feb7a40d64cbf9622d1eb8897daa" exitCode=0 Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.942795 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" event={"ID":"e5de01ce-d531-45f2-bf05-f66ada293780","Type":"ContainerDied","Data":"874a410fe2e7ab449b90256e3396e21a9592feb7a40d64cbf9622d1eb8897daa"} Feb 27 06:44:02 crc kubenswrapper[4725]: I0227 06:44:02.984559 4725 scope.go:117] "RemoveContainer" containerID="03649b7b355ab0d963f6f48aeec023aa86a68a2d053be838270217a1e9d76cf3" Feb 27 06:44:03 crc kubenswrapper[4725]: I0227 06:44:03.032928 4725 scope.go:117] "RemoveContainer" containerID="b909b7e49a73595ec27c5d1a653740fb84a267c3e44c0b7eb176ff7cb3088e70" Feb 27 06:44:03 crc kubenswrapper[4725]: I0227 06:44:03.054593 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-g7bnn"] Feb 27 06:44:03 crc kubenswrapper[4725]: I0227 06:44:03.067484 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-g7bnn"] Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.262360 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c41298-c7b2-4ea5-b628-ac1149cfa7da" path="/var/lib/kubelet/pods/29c41298-c7b2-4ea5-b628-ac1149cfa7da/volumes" Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 
06:44:04.354247 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.514754 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnnp\" (UniqueName: \"kubernetes.io/projected/e5de01ce-d531-45f2-bf05-f66ada293780-kube-api-access-8cnnp\") pod \"e5de01ce-d531-45f2-bf05-f66ada293780\" (UID: \"e5de01ce-d531-45f2-bf05-f66ada293780\") " Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.522048 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5de01ce-d531-45f2-bf05-f66ada293780-kube-api-access-8cnnp" (OuterVolumeSpecName: "kube-api-access-8cnnp") pod "e5de01ce-d531-45f2-bf05-f66ada293780" (UID: "e5de01ce-d531-45f2-bf05-f66ada293780"). InnerVolumeSpecName "kube-api-access-8cnnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.618148 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cnnp\" (UniqueName: \"kubernetes.io/projected/e5de01ce-d531-45f2-bf05-f66ada293780-kube-api-access-8cnnp\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.964892 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" event={"ID":"e5de01ce-d531-45f2-bf05-f66ada293780","Type":"ContainerDied","Data":"c425a9aa705bd54009570aedff6f9e4b18c733e1b0422ab42faad1771e8b2d86"} Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.965276 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c425a9aa705bd54009570aedff6f9e4b18c733e1b0422ab42faad1771e8b2d86" Feb 27 06:44:04 crc kubenswrapper[4725]: I0227 06:44:04.965001 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536244-9cfm7" Feb 27 06:44:05 crc kubenswrapper[4725]: I0227 06:44:05.425620 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536238-p95pp"] Feb 27 06:44:05 crc kubenswrapper[4725]: I0227 06:44:05.438079 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536238-p95pp"] Feb 27 06:44:06 crc kubenswrapper[4725]: I0227 06:44:06.046618 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dhc4"] Feb 27 06:44:06 crc kubenswrapper[4725]: I0227 06:44:06.059125 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dhc4"] Feb 27 06:44:06 crc kubenswrapper[4725]: I0227 06:44:06.275906 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01523399-3075-45db-8d71-e7adcd7c6e5a" path="/var/lib/kubelet/pods/01523399-3075-45db-8d71-e7adcd7c6e5a/volumes" Feb 27 06:44:06 crc kubenswrapper[4725]: I0227 06:44:06.277320 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6a75bb-19e2-4fa9-894c-a89271aa9c50" path="/var/lib/kubelet/pods/fd6a75bb-19e2-4fa9-894c-a89271aa9c50/volumes" Feb 27 06:44:32 crc kubenswrapper[4725]: I0227 06:44:32.554949 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:44:32 crc kubenswrapper[4725]: I0227 06:44:32.555476 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 27 06:44:41 crc kubenswrapper[4725]: I0227 06:44:41.364933 4725 generic.go:334] "Generic (PLEG): container finished" podID="8b00cf98-bb69-4c5e-8f34-e862f1acf329" containerID="17961e0a4acd69cb32158cc39c1fd6b46c9ddbd04c2e4de239f492cdaae9660d" exitCode=0 Feb 27 06:44:41 crc kubenswrapper[4725]: I0227 06:44:41.365023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" event={"ID":"8b00cf98-bb69-4c5e-8f34-e862f1acf329","Type":"ContainerDied","Data":"17961e0a4acd69cb32158cc39c1fd6b46c9ddbd04c2e4de239f492cdaae9660d"} Feb 27 06:44:42 crc kubenswrapper[4725]: I0227 06:44:42.936846 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.012433 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-ssh-key-openstack-edpm-ipam\") pod \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.012611 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwmst\" (UniqueName: \"kubernetes.io/projected/8b00cf98-bb69-4c5e-8f34-e862f1acf329-kube-api-access-zwmst\") pod \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.012713 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-inventory\") pod \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\" (UID: \"8b00cf98-bb69-4c5e-8f34-e862f1acf329\") " Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.018077 4725 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b00cf98-bb69-4c5e-8f34-e862f1acf329-kube-api-access-zwmst" (OuterVolumeSpecName: "kube-api-access-zwmst") pod "8b00cf98-bb69-4c5e-8f34-e862f1acf329" (UID: "8b00cf98-bb69-4c5e-8f34-e862f1acf329"). InnerVolumeSpecName "kube-api-access-zwmst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.038387 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-inventory" (OuterVolumeSpecName: "inventory") pod "8b00cf98-bb69-4c5e-8f34-e862f1acf329" (UID: "8b00cf98-bb69-4c5e-8f34-e862f1acf329"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.067310 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b00cf98-bb69-4c5e-8f34-e862f1acf329" (UID: "8b00cf98-bb69-4c5e-8f34-e862f1acf329"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.115865 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwmst\" (UniqueName: \"kubernetes.io/projected/8b00cf98-bb69-4c5e-8f34-e862f1acf329-kube-api-access-zwmst\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.115933 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.115956 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b00cf98-bb69-4c5e-8f34-e862f1acf329-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.396021 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" event={"ID":"8b00cf98-bb69-4c5e-8f34-e862f1acf329","Type":"ContainerDied","Data":"db87f18cfdc6baccbcc8ccf625475b3246456d5460c40190d558ad5dc16bda68"} Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.396085 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db87f18cfdc6baccbcc8ccf625475b3246456d5460c40190d558ad5dc16bda68" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.396188 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.558369 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ctvww"] Feb 27 06:44:43 crc kubenswrapper[4725]: E0227 06:44:43.561771 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b00cf98-bb69-4c5e-8f34-e862f1acf329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.561809 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b00cf98-bb69-4c5e-8f34-e862f1acf329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:44:43 crc kubenswrapper[4725]: E0227 06:44:43.561857 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5de01ce-d531-45f2-bf05-f66ada293780" containerName="oc" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.561871 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5de01ce-d531-45f2-bf05-f66ada293780" containerName="oc" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.562212 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5de01ce-d531-45f2-bf05-f66ada293780" containerName="oc" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.562254 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b00cf98-bb69-4c5e-8f34-e862f1acf329" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.563281 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.566560 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ctvww"] Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.576835 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.577112 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.577257 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.577544 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.729317 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcx6v\" (UniqueName: \"kubernetes.io/projected/22798438-191d-4ecf-ab5d-23af37e208b3-kube-api-access-hcx6v\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.729514 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.729875 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.832216 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.832281 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcx6v\" (UniqueName: \"kubernetes.io/projected/22798438-191d-4ecf-ab5d-23af37e208b3-kube-api-access-hcx6v\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.832365 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.837734 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: 
I0227 06:44:43.837947 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.877737 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcx6v\" (UniqueName: \"kubernetes.io/projected/22798438-191d-4ecf-ab5d-23af37e208b3-kube-api-access-hcx6v\") pod \"ssh-known-hosts-edpm-deployment-ctvww\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:43 crc kubenswrapper[4725]: I0227 06:44:43.890978 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:44 crc kubenswrapper[4725]: I0227 06:44:44.465246 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ctvww"] Feb 27 06:44:45 crc kubenswrapper[4725]: I0227 06:44:45.419940 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" event={"ID":"22798438-191d-4ecf-ab5d-23af37e208b3","Type":"ContainerStarted","Data":"da1b655b5f5675a9b86e85068d9b2d1384ead2c166f58ff814e9c744216efd0a"} Feb 27 06:44:46 crc kubenswrapper[4725]: I0227 06:44:46.433991 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" event={"ID":"22798438-191d-4ecf-ab5d-23af37e208b3","Type":"ContainerStarted","Data":"ca87da942c091b52c8ff1da935df6bf54ee52fa090a8e79df70f80ed5791ed2b"} Feb 27 06:44:46 crc kubenswrapper[4725]: I0227 06:44:46.461462 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" podStartSLOduration=2.741769268 
podStartE2EDuration="3.461436647s" podCreationTimestamp="2026-02-27 06:44:43 +0000 UTC" firstStartedPulling="2026-02-27 06:44:44.470188449 +0000 UTC m=+2062.932809058" lastFinishedPulling="2026-02-27 06:44:45.189855868 +0000 UTC m=+2063.652476437" observedRunningTime="2026-02-27 06:44:46.451193816 +0000 UTC m=+2064.913814415" watchObservedRunningTime="2026-02-27 06:44:46.461436647 +0000 UTC m=+2064.924057216" Feb 27 06:44:49 crc kubenswrapper[4725]: I0227 06:44:49.037253 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wzwpl"] Feb 27 06:44:49 crc kubenswrapper[4725]: I0227 06:44:49.045097 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wzwpl"] Feb 27 06:44:50 crc kubenswrapper[4725]: I0227 06:44:50.269720 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64cdb049-7b8a-4c79-8c77-678172f96778" path="/var/lib/kubelet/pods/64cdb049-7b8a-4c79-8c77-678172f96778/volumes" Feb 27 06:44:53 crc kubenswrapper[4725]: I0227 06:44:53.520940 4725 generic.go:334] "Generic (PLEG): container finished" podID="22798438-191d-4ecf-ab5d-23af37e208b3" containerID="ca87da942c091b52c8ff1da935df6bf54ee52fa090a8e79df70f80ed5791ed2b" exitCode=0 Feb 27 06:44:53 crc kubenswrapper[4725]: I0227 06:44:53.521038 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" event={"ID":"22798438-191d-4ecf-ab5d-23af37e208b3","Type":"ContainerDied","Data":"ca87da942c091b52c8ff1da935df6bf54ee52fa090a8e79df70f80ed5791ed2b"} Feb 27 06:44:54 crc kubenswrapper[4725]: I0227 06:44:54.958822 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.078109 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-inventory-0\") pod \"22798438-191d-4ecf-ab5d-23af37e208b3\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.078253 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcx6v\" (UniqueName: \"kubernetes.io/projected/22798438-191d-4ecf-ab5d-23af37e208b3-kube-api-access-hcx6v\") pod \"22798438-191d-4ecf-ab5d-23af37e208b3\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.078454 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-ssh-key-openstack-edpm-ipam\") pod \"22798438-191d-4ecf-ab5d-23af37e208b3\" (UID: \"22798438-191d-4ecf-ab5d-23af37e208b3\") " Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.084446 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22798438-191d-4ecf-ab5d-23af37e208b3-kube-api-access-hcx6v" (OuterVolumeSpecName: "kube-api-access-hcx6v") pod "22798438-191d-4ecf-ab5d-23af37e208b3" (UID: "22798438-191d-4ecf-ab5d-23af37e208b3"). InnerVolumeSpecName "kube-api-access-hcx6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.105937 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "22798438-191d-4ecf-ab5d-23af37e208b3" (UID: "22798438-191d-4ecf-ab5d-23af37e208b3"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.120606 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "22798438-191d-4ecf-ab5d-23af37e208b3" (UID: "22798438-191d-4ecf-ab5d-23af37e208b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.181238 4725 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.181304 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcx6v\" (UniqueName: \"kubernetes.io/projected/22798438-191d-4ecf-ab5d-23af37e208b3-kube-api-access-hcx6v\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.181327 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22798438-191d-4ecf-ab5d-23af37e208b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.542516 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" event={"ID":"22798438-191d-4ecf-ab5d-23af37e208b3","Type":"ContainerDied","Data":"da1b655b5f5675a9b86e85068d9b2d1384ead2c166f58ff814e9c744216efd0a"} Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.542566 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1b655b5f5675a9b86e85068d9b2d1384ead2c166f58ff814e9c744216efd0a" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.542589 
4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ctvww" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.650486 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm"] Feb 27 06:44:55 crc kubenswrapper[4725]: E0227 06:44:55.650903 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22798438-191d-4ecf-ab5d-23af37e208b3" containerName="ssh-known-hosts-edpm-deployment" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.650925 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="22798438-191d-4ecf-ab5d-23af37e208b3" containerName="ssh-known-hosts-edpm-deployment" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.651180 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="22798438-191d-4ecf-ab5d-23af37e208b3" containerName="ssh-known-hosts-edpm-deployment" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.651888 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.655244 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.656620 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.656711 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.656622 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.676196 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm"] Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.692641 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.692831 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.692956 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlpm\" (UniqueName: \"kubernetes.io/projected/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-kube-api-access-qrlpm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.795231 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.795442 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.795495 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlpm\" (UniqueName: \"kubernetes.io/projected/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-kube-api-access-qrlpm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.800323 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: 
\"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.802812 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.815842 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlpm\" (UniqueName: \"kubernetes.io/projected/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-kube-api-access-qrlpm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55xwm\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:55 crc kubenswrapper[4725]: I0227 06:44:55.974196 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:44:56 crc kubenswrapper[4725]: I0227 06:44:56.611574 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm"] Feb 27 06:44:57 crc kubenswrapper[4725]: I0227 06:44:57.570011 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" event={"ID":"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6","Type":"ContainerStarted","Data":"b3af6e930a2392a4aa90886fc9ec65dd6f82edb74e9061ed7085c245890dbd68"} Feb 27 06:44:57 crc kubenswrapper[4725]: I0227 06:44:57.570273 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" event={"ID":"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6","Type":"ContainerStarted","Data":"0d0809e2e206e62ee408046649739658afdeee8d12199f04e375766eb326942a"} Feb 27 06:44:57 crc kubenswrapper[4725]: I0227 06:44:57.630057 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" podStartSLOduration=2.23130694 podStartE2EDuration="2.630033466s" podCreationTimestamp="2026-02-27 06:44:55 +0000 UTC" firstStartedPulling="2026-02-27 06:44:56.619164588 +0000 UTC m=+2075.081785167" lastFinishedPulling="2026-02-27 06:44:57.017891084 +0000 UTC m=+2075.480511693" observedRunningTime="2026-02-27 06:44:57.612033424 +0000 UTC m=+2076.074654013" watchObservedRunningTime="2026-02-27 06:44:57.630033466 +0000 UTC m=+2076.092654045" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.314187 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7hmpx"] Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.317039 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.327025 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hmpx"] Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.355400 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-catalog-content\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.355935 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-utilities\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.356726 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whswl\" (UniqueName: \"kubernetes.io/projected/dbc7a2f1-caac-4816-b945-64718d6e5997-kube-api-access-whswl\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.458912 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-catalog-content\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.459110 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-utilities\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.459199 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whswl\" (UniqueName: \"kubernetes.io/projected/dbc7a2f1-caac-4816-b945-64718d6e5997-kube-api-access-whswl\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.459491 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-utilities\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.459697 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-catalog-content\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.481894 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whswl\" (UniqueName: \"kubernetes.io/projected/dbc7a2f1-caac-4816-b945-64718d6e5997-kube-api-access-whswl\") pod \"redhat-marketplace-7hmpx\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:58 crc kubenswrapper[4725]: I0227 06:44:58.651345 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:44:59 crc kubenswrapper[4725]: W0227 06:44:59.172445 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbc7a2f1_caac_4816_b945_64718d6e5997.slice/crio-b48f25d3897b6ba458542d10336fbe88c2060966d7cc315055cc007fda01fc2d WatchSource:0}: Error finding container b48f25d3897b6ba458542d10336fbe88c2060966d7cc315055cc007fda01fc2d: Status 404 returned error can't find the container with id b48f25d3897b6ba458542d10336fbe88c2060966d7cc315055cc007fda01fc2d Feb 27 06:44:59 crc kubenswrapper[4725]: I0227 06:44:59.181989 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hmpx"] Feb 27 06:44:59 crc kubenswrapper[4725]: I0227 06:44:59.595607 4725 generic.go:334] "Generic (PLEG): container finished" podID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerID="dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13" exitCode=0 Feb 27 06:44:59 crc kubenswrapper[4725]: I0227 06:44:59.595672 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerDied","Data":"dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13"} Feb 27 06:44:59 crc kubenswrapper[4725]: I0227 06:44:59.595735 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerStarted","Data":"b48f25d3897b6ba458542d10336fbe88c2060966d7cc315055cc007fda01fc2d"} Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.153514 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz"] Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.155037 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.157444 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.157578 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.182772 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz"] Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.194108 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a61a62-28d7-42be-b58b-1d98821caefb-secret-volume\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.194218 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a61a62-28d7-42be-b58b-1d98821caefb-config-volume\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.194269 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5j9h\" (UniqueName: \"kubernetes.io/projected/a3a61a62-28d7-42be-b58b-1d98821caefb-kube-api-access-c5j9h\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.296064 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5j9h\" (UniqueName: \"kubernetes.io/projected/a3a61a62-28d7-42be-b58b-1d98821caefb-kube-api-access-c5j9h\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.296129 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a61a62-28d7-42be-b58b-1d98821caefb-config-volume\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.296419 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a61a62-28d7-42be-b58b-1d98821caefb-secret-volume\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.297683 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a61a62-28d7-42be-b58b-1d98821caefb-config-volume\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.325442 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5j9h\" (UniqueName: 
\"kubernetes.io/projected/a3a61a62-28d7-42be-b58b-1d98821caefb-kube-api-access-c5j9h\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.325664 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a61a62-28d7-42be-b58b-1d98821caefb-secret-volume\") pod \"collect-profiles-29536245-b96nz\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.479907 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:00 crc kubenswrapper[4725]: I0227 06:45:00.934021 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz"] Feb 27 06:45:00 crc kubenswrapper[4725]: W0227 06:45:00.942263 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a61a62_28d7_42be_b58b_1d98821caefb.slice/crio-001fb63e2b92b66af0ece0353ba6af73c16bc64419ff2d439da86ceb16845ad2 WatchSource:0}: Error finding container 001fb63e2b92b66af0ece0353ba6af73c16bc64419ff2d439da86ceb16845ad2: Status 404 returned error can't find the container with id 001fb63e2b92b66af0ece0353ba6af73c16bc64419ff2d439da86ceb16845ad2 Feb 27 06:45:01 crc kubenswrapper[4725]: I0227 06:45:01.634765 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" event={"ID":"a3a61a62-28d7-42be-b58b-1d98821caefb","Type":"ContainerStarted","Data":"f5dbf9bd4c75f25e78483cf73a97a75a735db58753202f06a2085e5990b85ba0"} Feb 27 06:45:01 crc kubenswrapper[4725]: 
I0227 06:45:01.635031 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" event={"ID":"a3a61a62-28d7-42be-b58b-1d98821caefb","Type":"ContainerStarted","Data":"001fb63e2b92b66af0ece0353ba6af73c16bc64419ff2d439da86ceb16845ad2"} Feb 27 06:45:01 crc kubenswrapper[4725]: I0227 06:45:01.649120 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerStarted","Data":"b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899"} Feb 27 06:45:01 crc kubenswrapper[4725]: I0227 06:45:01.669226 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" podStartSLOduration=1.6691991750000001 podStartE2EDuration="1.669199175s" podCreationTimestamp="2026-02-27 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:45:01.66866921 +0000 UTC m=+2080.131289789" watchObservedRunningTime="2026-02-27 06:45:01.669199175 +0000 UTC m=+2080.131819744" Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.554850 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.555196 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:45:02 crc 
kubenswrapper[4725]: I0227 06:45:02.555243 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.555920 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b501af946b1ad795a1755f721e8e4751f676f233bfcb14cf18cba37c4f4ffaf"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.555988 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://0b501af946b1ad795a1755f721e8e4751f676f233bfcb14cf18cba37c4f4ffaf" gracePeriod=600 Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.660899 4725 generic.go:334] "Generic (PLEG): container finished" podID="a3a61a62-28d7-42be-b58b-1d98821caefb" containerID="f5dbf9bd4c75f25e78483cf73a97a75a735db58753202f06a2085e5990b85ba0" exitCode=0 Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.660980 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" event={"ID":"a3a61a62-28d7-42be-b58b-1d98821caefb","Type":"ContainerDied","Data":"f5dbf9bd4c75f25e78483cf73a97a75a735db58753202f06a2085e5990b85ba0"} Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.663209 4725 generic.go:334] "Generic (PLEG): container finished" podID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerID="b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899" exitCode=0 Feb 27 06:45:02 crc kubenswrapper[4725]: I0227 06:45:02.663264 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerDied","Data":"b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899"} Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.232925 4725 scope.go:117] "RemoveContainer" containerID="334dee8c7e9b066f350f06f9e395615502274e2babcb569fd49c027877e81378" Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.282024 4725 scope.go:117] "RemoveContainer" containerID="07f6abf4996093c8ae50443c097e75baa035616a4ebcfb705657f9176d11d77b" Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.319685 4725 scope.go:117] "RemoveContainer" containerID="3541b814b0fae0eb3a9e991dda2a227e7421c9d59d42990f2b1a6f091b0203ab" Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.382824 4725 scope.go:117] "RemoveContainer" containerID="655cebfcb00fd50cafc68d9c30ef5633763964a6ccade3ab5376920ab1631492" Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.674030 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="0b501af946b1ad795a1755f721e8e4751f676f233bfcb14cf18cba37c4f4ffaf" exitCode=0 Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.674062 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"0b501af946b1ad795a1755f721e8e4751f676f233bfcb14cf18cba37c4f4ffaf"} Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.674396 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"} Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.674413 4725 scope.go:117] "RemoveContainer" 
containerID="c75a6aab3de33adc15eabafdbb08d3cf08fdda7ca737d701af23f2b70d286e1d" Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.681467 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerStarted","Data":"a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d"} Feb 27 06:45:03 crc kubenswrapper[4725]: I0227 06:45:03.717830 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7hmpx" podStartSLOduration=2.186117112 podStartE2EDuration="5.717814974s" podCreationTimestamp="2026-02-27 06:44:58 +0000 UTC" firstStartedPulling="2026-02-27 06:44:59.598215869 +0000 UTC m=+2078.060836438" lastFinishedPulling="2026-02-27 06:45:03.129913731 +0000 UTC m=+2081.592534300" observedRunningTime="2026-02-27 06:45:03.713270945 +0000 UTC m=+2082.175891514" watchObservedRunningTime="2026-02-27 06:45:03.717814974 +0000 UTC m=+2082.180435533" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.046796 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.177587 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a61a62-28d7-42be-b58b-1d98821caefb-config-volume\") pod \"a3a61a62-28d7-42be-b58b-1d98821caefb\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.177795 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a61a62-28d7-42be-b58b-1d98821caefb-secret-volume\") pod \"a3a61a62-28d7-42be-b58b-1d98821caefb\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.177879 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5j9h\" (UniqueName: \"kubernetes.io/projected/a3a61a62-28d7-42be-b58b-1d98821caefb-kube-api-access-c5j9h\") pod \"a3a61a62-28d7-42be-b58b-1d98821caefb\" (UID: \"a3a61a62-28d7-42be-b58b-1d98821caefb\") " Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.178462 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a61a62-28d7-42be-b58b-1d98821caefb-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3a61a62-28d7-42be-b58b-1d98821caefb" (UID: "a3a61a62-28d7-42be-b58b-1d98821caefb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.186614 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3a61a62-28d7-42be-b58b-1d98821caefb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3a61a62-28d7-42be-b58b-1d98821caefb" (UID: "a3a61a62-28d7-42be-b58b-1d98821caefb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.186715 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a61a62-28d7-42be-b58b-1d98821caefb-kube-api-access-c5j9h" (OuterVolumeSpecName: "kube-api-access-c5j9h") pod "a3a61a62-28d7-42be-b58b-1d98821caefb" (UID: "a3a61a62-28d7-42be-b58b-1d98821caefb"). InnerVolumeSpecName "kube-api-access-c5j9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.279997 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a61a62-28d7-42be-b58b-1d98821caefb-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.280031 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5j9h\" (UniqueName: \"kubernetes.io/projected/a3a61a62-28d7-42be-b58b-1d98821caefb-kube-api-access-c5j9h\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.280039 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a61a62-28d7-42be-b58b-1d98821caefb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.696980 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.696981 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz" event={"ID":"a3a61a62-28d7-42be-b58b-1d98821caefb","Type":"ContainerDied","Data":"001fb63e2b92b66af0ece0353ba6af73c16bc64419ff2d439da86ceb16845ad2"} Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.697482 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001fb63e2b92b66af0ece0353ba6af73c16bc64419ff2d439da86ceb16845ad2" Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.748772 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"] Feb 27 06:45:04 crc kubenswrapper[4725]: I0227 06:45:04.769812 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536200-5ttzp"] Feb 27 06:45:05 crc kubenswrapper[4725]: I0227 06:45:05.710116 4725 generic.go:334] "Generic (PLEG): container finished" podID="8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" containerID="b3af6e930a2392a4aa90886fc9ec65dd6f82edb74e9061ed7085c245890dbd68" exitCode=0 Feb 27 06:45:05 crc kubenswrapper[4725]: I0227 06:45:05.710166 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" event={"ID":"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6","Type":"ContainerDied","Data":"b3af6e930a2392a4aa90886fc9ec65dd6f82edb74e9061ed7085c245890dbd68"} Feb 27 06:45:06 crc kubenswrapper[4725]: I0227 06:45:06.272273 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1908a5-f826-4eae-a6fa-c899dda28b57" path="/var/lib/kubelet/pods/5d1908a5-f826-4eae-a6fa-c899dda28b57/volumes" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.155013 4725 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.239544 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-ssh-key-openstack-edpm-ipam\") pod \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.239812 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-inventory\") pod \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.239936 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlpm\" (UniqueName: \"kubernetes.io/projected/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-kube-api-access-qrlpm\") pod \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\" (UID: \"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6\") " Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.254512 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-kube-api-access-qrlpm" (OuterVolumeSpecName: "kube-api-access-qrlpm") pod "8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" (UID: "8c8e8aea-4c46-4fe2-844f-2c51d7662fa6"). InnerVolumeSpecName "kube-api-access-qrlpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.276518 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-inventory" (OuterVolumeSpecName: "inventory") pod "8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" (UID: "8c8e8aea-4c46-4fe2-844f-2c51d7662fa6"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.279082 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" (UID: "8c8e8aea-4c46-4fe2-844f-2c51d7662fa6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.343411 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.343452 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.343465 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlpm\" (UniqueName: \"kubernetes.io/projected/8c8e8aea-4c46-4fe2-844f-2c51d7662fa6-kube-api-access-qrlpm\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.738795 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" event={"ID":"8c8e8aea-4c46-4fe2-844f-2c51d7662fa6","Type":"ContainerDied","Data":"0d0809e2e206e62ee408046649739658afdeee8d12199f04e375766eb326942a"} Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.739170 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0809e2e206e62ee408046649739658afdeee8d12199f04e375766eb326942a" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 
06:45:07.738888 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55xwm" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.844462 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77"] Feb 27 06:45:07 crc kubenswrapper[4725]: E0227 06:45:07.845095 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.845122 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:45:07 crc kubenswrapper[4725]: E0227 06:45:07.845150 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a61a62-28d7-42be-b58b-1d98821caefb" containerName="collect-profiles" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.845158 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a61a62-28d7-42be-b58b-1d98821caefb" containerName="collect-profiles" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.845420 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8e8aea-4c46-4fe2-844f-2c51d7662fa6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.845453 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a61a62-28d7-42be-b58b-1d98821caefb" containerName="collect-profiles" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.846345 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.848838 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.849114 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.849277 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.849541 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.872526 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77"] Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.960759 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndbg\" (UniqueName: \"kubernetes.io/projected/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-kube-api-access-lndbg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.960859 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:07 crc kubenswrapper[4725]: I0227 06:45:07.961199 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.062593 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndbg\" (UniqueName: \"kubernetes.io/projected/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-kube-api-access-lndbg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.062672 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.062774 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.068509 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.073820 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.082241 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lndbg\" (UniqueName: \"kubernetes.io/projected/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-kube-api-access-lndbg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.170462 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.651781 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.652282 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.748372 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.836647 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:45:08 crc kubenswrapper[4725]: I0227 06:45:08.863394 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77"] Feb 27 06:45:09 crc kubenswrapper[4725]: I0227 06:45:09.762107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" event={"ID":"9ac46847-17bf-49e5-ae76-1ea3af18c9f5","Type":"ContainerStarted","Data":"7f9f60fad71565bc3f620c6ea4468af7d8fa5c2b2778113fdc6f38f998a5fc94"} Feb 27 06:45:09 crc kubenswrapper[4725]: I0227 06:45:09.762608 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" event={"ID":"9ac46847-17bf-49e5-ae76-1ea3af18c9f5","Type":"ContainerStarted","Data":"1a3b137e644a462d3f8f9d50dc4021251105465911f7f6003a1dbd75be7274b4"} Feb 27 06:45:09 crc kubenswrapper[4725]: I0227 06:45:09.785921 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" podStartSLOduration=2.361326642 podStartE2EDuration="2.785903472s" 
podCreationTimestamp="2026-02-27 06:45:07 +0000 UTC" firstStartedPulling="2026-02-27 06:45:08.880976816 +0000 UTC m=+2087.343597425" lastFinishedPulling="2026-02-27 06:45:09.305553676 +0000 UTC m=+2087.768174255" observedRunningTime="2026-02-27 06:45:09.776779653 +0000 UTC m=+2088.239400222" watchObservedRunningTime="2026-02-27 06:45:09.785903472 +0000 UTC m=+2088.248524041" Feb 27 06:45:11 crc kubenswrapper[4725]: I0227 06:45:11.686115 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hmpx"] Feb 27 06:45:11 crc kubenswrapper[4725]: I0227 06:45:11.686995 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7hmpx" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="registry-server" containerID="cri-o://a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d" gracePeriod=2 Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.211616 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.357758 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-catalog-content\") pod \"dbc7a2f1-caac-4816-b945-64718d6e5997\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.358194 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-utilities\") pod \"dbc7a2f1-caac-4816-b945-64718d6e5997\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.358392 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whswl\" (UniqueName: \"kubernetes.io/projected/dbc7a2f1-caac-4816-b945-64718d6e5997-kube-api-access-whswl\") pod \"dbc7a2f1-caac-4816-b945-64718d6e5997\" (UID: \"dbc7a2f1-caac-4816-b945-64718d6e5997\") " Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.358981 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-utilities" (OuterVolumeSpecName: "utilities") pod "dbc7a2f1-caac-4816-b945-64718d6e5997" (UID: "dbc7a2f1-caac-4816-b945-64718d6e5997"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.364707 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc7a2f1-caac-4816-b945-64718d6e5997-kube-api-access-whswl" (OuterVolumeSpecName: "kube-api-access-whswl") pod "dbc7a2f1-caac-4816-b945-64718d6e5997" (UID: "dbc7a2f1-caac-4816-b945-64718d6e5997"). InnerVolumeSpecName "kube-api-access-whswl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.405273 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbc7a2f1-caac-4816-b945-64718d6e5997" (UID: "dbc7a2f1-caac-4816-b945-64718d6e5997"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.461675 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whswl\" (UniqueName: \"kubernetes.io/projected/dbc7a2f1-caac-4816-b945-64718d6e5997-kube-api-access-whswl\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.461706 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.461719 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc7a2f1-caac-4816-b945-64718d6e5997-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.800736 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hmpx" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.800760 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerDied","Data":"a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d"} Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.800682 4725 generic.go:334] "Generic (PLEG): container finished" podID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerID="a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d" exitCode=0 Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.802801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hmpx" event={"ID":"dbc7a2f1-caac-4816-b945-64718d6e5997","Type":"ContainerDied","Data":"b48f25d3897b6ba458542d10336fbe88c2060966d7cc315055cc007fda01fc2d"} Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.802843 4725 scope.go:117] "RemoveContainer" containerID="a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.843701 4725 scope.go:117] "RemoveContainer" containerID="b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.868176 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hmpx"] Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.881746 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hmpx"] Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.887511 4725 scope.go:117] "RemoveContainer" containerID="dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.931191 4725 scope.go:117] "RemoveContainer" 
containerID="a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d" Feb 27 06:45:12 crc kubenswrapper[4725]: E0227 06:45:12.972832 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d\": container with ID starting with a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d not found: ID does not exist" containerID="a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.972909 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d"} err="failed to get container status \"a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d\": rpc error: code = NotFound desc = could not find container \"a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d\": container with ID starting with a9927860bfb2e1393a5d4ff1dea887918b195607bf6b9c8a29d79c6cc8b91f3d not found: ID does not exist" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.972950 4725 scope.go:117] "RemoveContainer" containerID="b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899" Feb 27 06:45:12 crc kubenswrapper[4725]: E0227 06:45:12.973491 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899\": container with ID starting with b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899 not found: ID does not exist" containerID="b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.973534 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899"} err="failed to get container status \"b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899\": rpc error: code = NotFound desc = could not find container \"b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899\": container with ID starting with b24aa28bbaa28f0d470fed7db91f53752e21cb26d442c7e0a20e6976da64a899 not found: ID does not exist" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.973567 4725 scope.go:117] "RemoveContainer" containerID="dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13" Feb 27 06:45:12 crc kubenswrapper[4725]: E0227 06:45:12.973916 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13\": container with ID starting with dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13 not found: ID does not exist" containerID="dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13" Feb 27 06:45:12 crc kubenswrapper[4725]: I0227 06:45:12.973970 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13"} err="failed to get container status \"dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13\": rpc error: code = NotFound desc = could not find container \"dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13\": container with ID starting with dd5fd7ce01741a7cf420b2295b552c53ac7e3ff464f8550de9dcc6a89208cb13 not found: ID does not exist" Feb 27 06:45:14 crc kubenswrapper[4725]: I0227 06:45:14.269722 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" path="/var/lib/kubelet/pods/dbc7a2f1-caac-4816-b945-64718d6e5997/volumes" Feb 27 06:45:19 crc kubenswrapper[4725]: I0227 
06:45:19.888266 4725 generic.go:334] "Generic (PLEG): container finished" podID="9ac46847-17bf-49e5-ae76-1ea3af18c9f5" containerID="7f9f60fad71565bc3f620c6ea4468af7d8fa5c2b2778113fdc6f38f998a5fc94" exitCode=0 Feb 27 06:45:19 crc kubenswrapper[4725]: I0227 06:45:19.888386 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" event={"ID":"9ac46847-17bf-49e5-ae76-1ea3af18c9f5","Type":"ContainerDied","Data":"7f9f60fad71565bc3f620c6ea4468af7d8fa5c2b2778113fdc6f38f998a5fc94"} Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.418567 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.479624 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-inventory\") pod \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.479832 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-ssh-key-openstack-edpm-ipam\") pod \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.479894 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndbg\" (UniqueName: \"kubernetes.io/projected/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-kube-api-access-lndbg\") pod \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\" (UID: \"9ac46847-17bf-49e5-ae76-1ea3af18c9f5\") " Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.486098 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-kube-api-access-lndbg" (OuterVolumeSpecName: "kube-api-access-lndbg") pod "9ac46847-17bf-49e5-ae76-1ea3af18c9f5" (UID: "9ac46847-17bf-49e5-ae76-1ea3af18c9f5"). InnerVolumeSpecName "kube-api-access-lndbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.523527 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-inventory" (OuterVolumeSpecName: "inventory") pod "9ac46847-17bf-49e5-ae76-1ea3af18c9f5" (UID: "9ac46847-17bf-49e5-ae76-1ea3af18c9f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.529557 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ac46847-17bf-49e5-ae76-1ea3af18c9f5" (UID: "9ac46847-17bf-49e5-ae76-1ea3af18c9f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.581507 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.581543 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndbg\" (UniqueName: \"kubernetes.io/projected/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-kube-api-access-lndbg\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.581555 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac46847-17bf-49e5-ae76-1ea3af18c9f5-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.914100 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" event={"ID":"9ac46847-17bf-49e5-ae76-1ea3af18c9f5","Type":"ContainerDied","Data":"1a3b137e644a462d3f8f9d50dc4021251105465911f7f6003a1dbd75be7274b4"} Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.914149 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a3b137e644a462d3f8f9d50dc4021251105465911f7f6003a1dbd75be7274b4" Feb 27 06:45:21 crc kubenswrapper[4725]: I0227 06:45:21.914203 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.017976 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n"] Feb 27 06:45:22 crc kubenswrapper[4725]: E0227 06:45:22.018405 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="extract-utilities" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.018426 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="extract-utilities" Feb 27 06:45:22 crc kubenswrapper[4725]: E0227 06:45:22.018446 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="registry-server" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.018452 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="registry-server" Feb 27 06:45:22 crc kubenswrapper[4725]: E0227 06:45:22.018467 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="extract-content" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.018475 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="extract-content" Feb 27 06:45:22 crc kubenswrapper[4725]: E0227 06:45:22.018497 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac46847-17bf-49e5-ae76-1ea3af18c9f5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.018503 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac46847-17bf-49e5-ae76-1ea3af18c9f5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.018685 
4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc7a2f1-caac-4816-b945-64718d6e5997" containerName="registry-server" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.018710 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac46847-17bf-49e5-ae76-1ea3af18c9f5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.019388 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.021556 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.022530 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.022668 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.022848 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.023068 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.023193 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.023351 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.023462 4725 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.039623 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n"] Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091055 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091101 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091120 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091202 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091223 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091298 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091410 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091443 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091473 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091517 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091553 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091578 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnt6j\" (UniqueName: 
\"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-kube-api-access-nnt6j\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091601 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.091642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.193899 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.193987 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194032 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194101 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194157 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194185 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnt6j\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-kube-api-access-nnt6j\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194221 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194282 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194380 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194419 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 
crc kubenswrapper[4725]: I0227 06:45:22.194454 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194523 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.194603 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.197317 4725 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.198984 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.199251 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.199598 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.199629 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.199844 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.201676 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.202904 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 
06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.203462 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.204016 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.208075 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.208546 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.209330 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.209909 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.210038 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.210527 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.210565 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.210805 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.211873 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.233679 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnt6j\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-kube-api-access-nnt6j\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.348448 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.355956 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:45:22 crc kubenswrapper[4725]: I0227 06:45:22.956943 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n"] Feb 27 06:45:22 crc kubenswrapper[4725]: W0227 06:45:22.963266 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8732fb9d_c8a7_4cb3_acca_83301a2c03dc.slice/crio-302584915a8551157cf09b3f0d59fbe6b16d6fd6879b9969492c8fe756c17fe6 WatchSource:0}: Error finding container 302584915a8551157cf09b3f0d59fbe6b16d6fd6879b9969492c8fe756c17fe6: Status 404 returned error can't find the container with id 302584915a8551157cf09b3f0d59fbe6b16d6fd6879b9969492c8fe756c17fe6 Feb 27 06:45:23 crc kubenswrapper[4725]: I0227 06:45:23.395786 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:45:23 crc kubenswrapper[4725]: I0227 06:45:23.936780 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" event={"ID":"8732fb9d-c8a7-4cb3-acca-83301a2c03dc","Type":"ContainerStarted","Data":"1a480104a58a52f4560b1f7472e6db5fd149d35098acb01dde4704875aeb96ff"} Feb 27 06:45:23 crc kubenswrapper[4725]: I0227 06:45:23.936830 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" event={"ID":"8732fb9d-c8a7-4cb3-acca-83301a2c03dc","Type":"ContainerStarted","Data":"302584915a8551157cf09b3f0d59fbe6b16d6fd6879b9969492c8fe756c17fe6"} Feb 27 06:45:23 crc kubenswrapper[4725]: I0227 06:45:23.958033 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" podStartSLOduration=2.531623345 podStartE2EDuration="2.958012387s" podCreationTimestamp="2026-02-27 
06:45:21 +0000 UTC" firstStartedPulling="2026-02-27 06:45:22.9661777 +0000 UTC m=+2101.428798289" lastFinishedPulling="2026-02-27 06:45:23.392566722 +0000 UTC m=+2101.855187331" observedRunningTime="2026-02-27 06:45:23.955477695 +0000 UTC m=+2102.418098284" watchObservedRunningTime="2026-02-27 06:45:23.958012387 +0000 UTC m=+2102.420632966" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.164668 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536246-t6rcx"] Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.166644 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.169846 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.170100 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.170457 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.181226 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536246-t6rcx"] Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.263039 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d57m\" (UniqueName: \"kubernetes.io/projected/4e720dc9-93d2-4b2a-8a84-987f8f987324-kube-api-access-5d57m\") pod \"auto-csr-approver-29536246-t6rcx\" (UID: \"4e720dc9-93d2-4b2a-8a84-987f8f987324\") " pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.364552 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5d57m\" (UniqueName: \"kubernetes.io/projected/4e720dc9-93d2-4b2a-8a84-987f8f987324-kube-api-access-5d57m\") pod \"auto-csr-approver-29536246-t6rcx\" (UID: \"4e720dc9-93d2-4b2a-8a84-987f8f987324\") " pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.395566 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d57m\" (UniqueName: \"kubernetes.io/projected/4e720dc9-93d2-4b2a-8a84-987f8f987324-kube-api-access-5d57m\") pod \"auto-csr-approver-29536246-t6rcx\" (UID: \"4e720dc9-93d2-4b2a-8a84-987f8f987324\") " pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.512050 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:00 crc kubenswrapper[4725]: I0227 06:46:00.959719 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536246-t6rcx"] Feb 27 06:46:00 crc kubenswrapper[4725]: W0227 06:46:00.964083 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e720dc9_93d2_4b2a_8a84_987f8f987324.slice/crio-1273945e276725a90d92d1a9d058aea3a467c987d702a40c8eb1e9b5b1e687a1 WatchSource:0}: Error finding container 1273945e276725a90d92d1a9d058aea3a467c987d702a40c8eb1e9b5b1e687a1: Status 404 returned error can't find the container with id 1273945e276725a90d92d1a9d058aea3a467c987d702a40c8eb1e9b5b1e687a1 Feb 27 06:46:01 crc kubenswrapper[4725]: I0227 06:46:01.377502 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" event={"ID":"4e720dc9-93d2-4b2a-8a84-987f8f987324","Type":"ContainerStarted","Data":"1273945e276725a90d92d1a9d058aea3a467c987d702a40c8eb1e9b5b1e687a1"} Feb 27 06:46:02 crc kubenswrapper[4725]: I0227 06:46:02.389213 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" event={"ID":"4e720dc9-93d2-4b2a-8a84-987f8f987324","Type":"ContainerStarted","Data":"e9f8dadc1599cc5f42da0d8b897acab3121df43bb5622a2de3bb267e28c858da"} Feb 27 06:46:02 crc kubenswrapper[4725]: I0227 06:46:02.404889 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" podStartSLOduration=1.388335347 podStartE2EDuration="2.404871952s" podCreationTimestamp="2026-02-27 06:46:00 +0000 UTC" firstStartedPulling="2026-02-27 06:46:00.966951429 +0000 UTC m=+2139.429572038" lastFinishedPulling="2026-02-27 06:46:01.983488074 +0000 UTC m=+2140.446108643" observedRunningTime="2026-02-27 06:46:02.402485565 +0000 UTC m=+2140.865106154" watchObservedRunningTime="2026-02-27 06:46:02.404871952 +0000 UTC m=+2140.867492521" Feb 27 06:46:03 crc kubenswrapper[4725]: I0227 06:46:03.403234 4725 generic.go:334] "Generic (PLEG): container finished" podID="4e720dc9-93d2-4b2a-8a84-987f8f987324" containerID="e9f8dadc1599cc5f42da0d8b897acab3121df43bb5622a2de3bb267e28c858da" exitCode=0 Feb 27 06:46:03 crc kubenswrapper[4725]: I0227 06:46:03.403345 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" event={"ID":"4e720dc9-93d2-4b2a-8a84-987f8f987324","Type":"ContainerDied","Data":"e9f8dadc1599cc5f42da0d8b897acab3121df43bb5622a2de3bb267e28c858da"} Feb 27 06:46:03 crc kubenswrapper[4725]: I0227 06:46:03.491824 4725 scope.go:117] "RemoveContainer" containerID="6f3f0781b6836ad165a8d34c5aacfb80cf7fb492aa0d8b82cf018e02b33cbde1" Feb 27 06:46:04 crc kubenswrapper[4725]: I0227 06:46:04.420011 4725 generic.go:334] "Generic (PLEG): container finished" podID="8732fb9d-c8a7-4cb3-acca-83301a2c03dc" containerID="1a480104a58a52f4560b1f7472e6db5fd149d35098acb01dde4704875aeb96ff" exitCode=0 Feb 27 06:46:04 crc kubenswrapper[4725]: I0227 06:46:04.420081 4725 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" event={"ID":"8732fb9d-c8a7-4cb3-acca-83301a2c03dc","Type":"ContainerDied","Data":"1a480104a58a52f4560b1f7472e6db5fd149d35098acb01dde4704875aeb96ff"} Feb 27 06:46:04 crc kubenswrapper[4725]: I0227 06:46:04.742926 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:04 crc kubenswrapper[4725]: I0227 06:46:04.872000 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d57m\" (UniqueName: \"kubernetes.io/projected/4e720dc9-93d2-4b2a-8a84-987f8f987324-kube-api-access-5d57m\") pod \"4e720dc9-93d2-4b2a-8a84-987f8f987324\" (UID: \"4e720dc9-93d2-4b2a-8a84-987f8f987324\") " Feb 27 06:46:04 crc kubenswrapper[4725]: I0227 06:46:04.878099 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e720dc9-93d2-4b2a-8a84-987f8f987324-kube-api-access-5d57m" (OuterVolumeSpecName: "kube-api-access-5d57m") pod "4e720dc9-93d2-4b2a-8a84-987f8f987324" (UID: "4e720dc9-93d2-4b2a-8a84-987f8f987324"). InnerVolumeSpecName "kube-api-access-5d57m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:04 crc kubenswrapper[4725]: I0227 06:46:04.975336 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d57m\" (UniqueName: \"kubernetes.io/projected/4e720dc9-93d2-4b2a-8a84-987f8f987324-kube-api-access-5d57m\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.368228 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536240-c7jgl"] Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.377163 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536240-c7jgl"] Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.431157 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.431252 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536246-t6rcx" event={"ID":"4e720dc9-93d2-4b2a-8a84-987f8f987324","Type":"ContainerDied","Data":"1273945e276725a90d92d1a9d058aea3a467c987d702a40c8eb1e9b5b1e687a1"} Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.432179 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1273945e276725a90d92d1a9d058aea3a467c987d702a40c8eb1e9b5b1e687a1" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.833920 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901669 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ssh-key-openstack-edpm-ipam\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901716 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnt6j\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-kube-api-access-nnt6j\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901757 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ovn-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901788 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901858 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-neutron-metadata-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: 
\"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901894 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901947 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-nova-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.901972 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-inventory\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.902007 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-telemetry-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.902032 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-bootstrap-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.902086 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.902214 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-libvirt-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.902252 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-repo-setup-combined-ca-bundle\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.902327 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\" (UID: \"8732fb9d-c8a7-4cb3-acca-83301a2c03dc\") " Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.908581 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.910150 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.912694 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.912755 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.913430 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.914007 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.914081 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-kube-api-access-nnt6j" (OuterVolumeSpecName: "kube-api-access-nnt6j") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "kube-api-access-nnt6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.914977 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.915229 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.915605 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.929688 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.929754 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.947647 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:05 crc kubenswrapper[4725]: I0227 06:46:05.958090 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-inventory" (OuterVolumeSpecName: "inventory") pod "8732fb9d-c8a7-4cb3-acca-83301a2c03dc" (UID: "8732fb9d-c8a7-4cb3-acca-83301a2c03dc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004195 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004223 4725 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004233 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004242 4725 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004252 4725 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc 
kubenswrapper[4725]: I0227 06:46:06.004261 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004270 4725 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004279 4725 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004305 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004315 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004325 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnt6j\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-kube-api-access-nnt6j\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004334 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004342 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.004351 4725 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8732fb9d-c8a7-4cb3-acca-83301a2c03dc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.269633 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b9899d-9ed8-41a3-a1c5-c70459205e2a" path="/var/lib/kubelet/pods/d9b9899d-9ed8-41a3-a1c5-c70459205e2a/volumes" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.448877 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" event={"ID":"8732fb9d-c8a7-4cb3-acca-83301a2c03dc","Type":"ContainerDied","Data":"302584915a8551157cf09b3f0d59fbe6b16d6fd6879b9969492c8fe756c17fe6"} Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.450447 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302584915a8551157cf09b3f0d59fbe6b16d6fd6879b9969492c8fe756c17fe6" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.450711 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.623484 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz"] Feb 27 06:46:06 crc kubenswrapper[4725]: E0227 06:46:06.623891 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e720dc9-93d2-4b2a-8a84-987f8f987324" containerName="oc" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.623908 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e720dc9-93d2-4b2a-8a84-987f8f987324" containerName="oc" Feb 27 06:46:06 crc kubenswrapper[4725]: E0227 06:46:06.623925 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8732fb9d-c8a7-4cb3-acca-83301a2c03dc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.623933 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8732fb9d-c8a7-4cb3-acca-83301a2c03dc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.624110 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e720dc9-93d2-4b2a-8a84-987f8f987324" containerName="oc" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.624129 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8732fb9d-c8a7-4cb3-acca-83301a2c03dc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.624831 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.628130 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.629007 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.629422 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.629500 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.629505 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.653346 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz"] Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.718679 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/741a3436-861d-4cb0-925e-597423d841a9-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.718754 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhdf\" (UniqueName: \"kubernetes.io/projected/741a3436-861d-4cb0-925e-597423d841a9-kube-api-access-flhdf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: 
\"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.719103 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.719165 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.719266 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.821545 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.821578 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.821604 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.821654 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/741a3436-861d-4cb0-925e-597423d841a9-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.821685 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhdf\" (UniqueName: \"kubernetes.io/projected/741a3436-861d-4cb0-925e-597423d841a9-kube-api-access-flhdf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.825895 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.826143 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.826972 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/741a3436-861d-4cb0-925e-597423d841a9-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.832209 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.845931 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhdf\" (UniqueName: \"kubernetes.io/projected/741a3436-861d-4cb0-925e-597423d841a9-kube-api-access-flhdf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5c4sz\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:06 crc kubenswrapper[4725]: I0227 06:46:06.946121 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:46:07 crc kubenswrapper[4725]: I0227 06:46:07.296324 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz"] Feb 27 06:46:07 crc kubenswrapper[4725]: I0227 06:46:07.459880 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" event={"ID":"741a3436-861d-4cb0-925e-597423d841a9","Type":"ContainerStarted","Data":"e31cd92cb22da8859fa5d571aed24e97f3b44296ff7d449290880b238807d8bf"} Feb 27 06:46:08 crc kubenswrapper[4725]: I0227 06:46:08.469375 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" event={"ID":"741a3436-861d-4cb0-925e-597423d841a9","Type":"ContainerStarted","Data":"de6da35fa3a081c361a9f5aed0835d812f11abd3d5203b65525696ccd7bfcad2"} Feb 27 06:46:08 crc kubenswrapper[4725]: I0227 06:46:08.486275 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" podStartSLOduration=1.943239655 podStartE2EDuration="2.486258322s" podCreationTimestamp="2026-02-27 06:46:06 +0000 UTC" firstStartedPulling="2026-02-27 06:46:07.305493651 +0000 UTC m=+2145.768114210" lastFinishedPulling="2026-02-27 06:46:07.848512278 +0000 UTC m=+2146.311132877" observedRunningTime="2026-02-27 06:46:08.482413543 +0000 UTC m=+2146.945034112" watchObservedRunningTime="2026-02-27 06:46:08.486258322 +0000 UTC m=+2146.948878891" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.015302 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4lpc7"] Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.018054 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.036774 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4lpc7"] Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.078197 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-catalog-content\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.078308 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-utilities\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.078478 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmtq\" (UniqueName: \"kubernetes.io/projected/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-kube-api-access-6mmtq\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.180884 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-catalog-content\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.180986 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-utilities\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.181039 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmtq\" (UniqueName: \"kubernetes.io/projected/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-kube-api-access-6mmtq\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.181439 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-catalog-content\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.181480 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-utilities\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.204906 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmtq\" (UniqueName: \"kubernetes.io/projected/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-kube-api-access-6mmtq\") pod \"redhat-operators-4lpc7\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.396212 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:26 crc kubenswrapper[4725]: I0227 06:46:26.925713 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4lpc7"] Feb 27 06:46:27 crc kubenswrapper[4725]: I0227 06:46:27.674382 4725 generic.go:334] "Generic (PLEG): container finished" podID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerID="cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df" exitCode=0 Feb 27 06:46:27 crc kubenswrapper[4725]: I0227 06:46:27.674434 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerDied","Data":"cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df"} Feb 27 06:46:27 crc kubenswrapper[4725]: I0227 06:46:27.674788 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerStarted","Data":"b324e14876f6d8a3c3e93ae3c40ff30ba847f783ffc3f23517cbab9ee8844b2c"} Feb 27 06:46:28 crc kubenswrapper[4725]: I0227 06:46:28.711597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerStarted","Data":"fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc"} Feb 27 06:46:34 crc kubenswrapper[4725]: I0227 06:46:34.769924 4725 generic.go:334] "Generic (PLEG): container finished" podID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerID="fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc" exitCode=0 Feb 27 06:46:34 crc kubenswrapper[4725]: I0227 06:46:34.769995 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" 
event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerDied","Data":"fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc"} Feb 27 06:46:35 crc kubenswrapper[4725]: I0227 06:46:35.787899 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerStarted","Data":"6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484"} Feb 27 06:46:35 crc kubenswrapper[4725]: I0227 06:46:35.820135 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4lpc7" podStartSLOduration=3.349669685 podStartE2EDuration="10.820117382s" podCreationTimestamp="2026-02-27 06:46:25 +0000 UTC" firstStartedPulling="2026-02-27 06:46:27.676273021 +0000 UTC m=+2166.138893580" lastFinishedPulling="2026-02-27 06:46:35.146720668 +0000 UTC m=+2173.609341277" observedRunningTime="2026-02-27 06:46:35.812532947 +0000 UTC m=+2174.275153526" watchObservedRunningTime="2026-02-27 06:46:35.820117382 +0000 UTC m=+2174.282737951" Feb 27 06:46:36 crc kubenswrapper[4725]: I0227 06:46:36.396707 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:36 crc kubenswrapper[4725]: I0227 06:46:36.396815 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:37 crc kubenswrapper[4725]: I0227 06:46:37.458443 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4lpc7" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="registry-server" probeResult="failure" output=< Feb 27 06:46:37 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:46:37 crc kubenswrapper[4725]: > Feb 27 06:46:47 crc kubenswrapper[4725]: I0227 06:46:47.443528 4725 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-4lpc7" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="registry-server" probeResult="failure" output=< Feb 27 06:46:47 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:46:47 crc kubenswrapper[4725]: > Feb 27 06:46:56 crc kubenswrapper[4725]: I0227 06:46:56.461255 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:56 crc kubenswrapper[4725]: I0227 06:46:56.515796 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:57 crc kubenswrapper[4725]: I0227 06:46:57.220640 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4lpc7"] Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.042742 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4lpc7" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="registry-server" containerID="cri-o://6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484" gracePeriod=2 Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.614470 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.739754 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-catalog-content\") pod \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.740097 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-utilities\") pod \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.740148 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mmtq\" (UniqueName: \"kubernetes.io/projected/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-kube-api-access-6mmtq\") pod \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\" (UID: \"ad1a971d-2eb1-43f9-9563-2c9a41fe1769\") " Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.741074 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-utilities" (OuterVolumeSpecName: "utilities") pod "ad1a971d-2eb1-43f9-9563-2c9a41fe1769" (UID: "ad1a971d-2eb1-43f9-9563-2c9a41fe1769"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.750744 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-kube-api-access-6mmtq" (OuterVolumeSpecName: "kube-api-access-6mmtq") pod "ad1a971d-2eb1-43f9-9563-2c9a41fe1769" (UID: "ad1a971d-2eb1-43f9-9563-2c9a41fe1769"). InnerVolumeSpecName "kube-api-access-6mmtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.843127 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.843176 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mmtq\" (UniqueName: \"kubernetes.io/projected/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-kube-api-access-6mmtq\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.878191 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad1a971d-2eb1-43f9-9563-2c9a41fe1769" (UID: "ad1a971d-2eb1-43f9-9563-2c9a41fe1769"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:46:58 crc kubenswrapper[4725]: I0227 06:46:58.945676 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1a971d-2eb1-43f9-9563-2c9a41fe1769-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.055339 4725 generic.go:334] "Generic (PLEG): container finished" podID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerID="6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484" exitCode=0 Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.055388 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerDied","Data":"6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484"} Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.055397 4725 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4lpc7" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.055427 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lpc7" event={"ID":"ad1a971d-2eb1-43f9-9563-2c9a41fe1769","Type":"ContainerDied","Data":"b324e14876f6d8a3c3e93ae3c40ff30ba847f783ffc3f23517cbab9ee8844b2c"} Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.055450 4725 scope.go:117] "RemoveContainer" containerID="6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.080269 4725 scope.go:117] "RemoveContainer" containerID="fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.095940 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4lpc7"] Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.106800 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4lpc7"] Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.126356 4725 scope.go:117] "RemoveContainer" containerID="cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.164627 4725 scope.go:117] "RemoveContainer" containerID="6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484" Feb 27 06:46:59 crc kubenswrapper[4725]: E0227 06:46:59.165079 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484\": container with ID starting with 6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484 not found: ID does not exist" containerID="6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.165129 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484"} err="failed to get container status \"6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484\": rpc error: code = NotFound desc = could not find container \"6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484\": container with ID starting with 6cad952faee3f313c82de9ab673621cb18cc1e87bfc18f8991415b0e63aaf484 not found: ID does not exist" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.165161 4725 scope.go:117] "RemoveContainer" containerID="fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc" Feb 27 06:46:59 crc kubenswrapper[4725]: E0227 06:46:59.165585 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc\": container with ID starting with fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc not found: ID does not exist" containerID="fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.165621 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc"} err="failed to get container status \"fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc\": rpc error: code = NotFound desc = could not find container \"fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc\": container with ID starting with fe6ec01abdc16253eb63e7f807cb50ce5ade1f9e8a286645281b11464008efcc not found: ID does not exist" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.165672 4725 scope.go:117] "RemoveContainer" containerID="cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df" Feb 27 06:46:59 crc kubenswrapper[4725]: E0227 
06:46:59.166029 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df\": container with ID starting with cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df not found: ID does not exist" containerID="cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df" Feb 27 06:46:59 crc kubenswrapper[4725]: I0227 06:46:59.166061 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df"} err="failed to get container status \"cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df\": rpc error: code = NotFound desc = could not find container \"cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df\": container with ID starting with cdffa93a3ad40d39794d1e5e64c5d253ba072dd0edbaa0c2ba413c887f6568df not found: ID does not exist" Feb 27 06:47:00 crc kubenswrapper[4725]: I0227 06:47:00.276110 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" path="/var/lib/kubelet/pods/ad1a971d-2eb1-43f9-9563-2c9a41fe1769/volumes" Feb 27 06:47:02 crc kubenswrapper[4725]: I0227 06:47:02.582465 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:47:02 crc kubenswrapper[4725]: I0227 06:47:02.582926 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 27 06:47:03 crc kubenswrapper[4725]: I0227 06:47:03.602960 4725 scope.go:117] "RemoveContainer" containerID="cc901290260babac09388ab371c70964b7aaa477716e23601ce2a10d1c99cfe9" Feb 27 06:47:17 crc kubenswrapper[4725]: I0227 06:47:17.273378 4725 generic.go:334] "Generic (PLEG): container finished" podID="741a3436-861d-4cb0-925e-597423d841a9" containerID="de6da35fa3a081c361a9f5aed0835d812f11abd3d5203b65525696ccd7bfcad2" exitCode=0 Feb 27 06:47:17 crc kubenswrapper[4725]: I0227 06:47:17.273528 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" event={"ID":"741a3436-861d-4cb0-925e-597423d841a9","Type":"ContainerDied","Data":"de6da35fa3a081c361a9f5aed0835d812f11abd3d5203b65525696ccd7bfcad2"} Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.729256 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.907958 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ssh-key-openstack-edpm-ipam\") pod \"741a3436-861d-4cb0-925e-597423d841a9\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.908073 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ovn-combined-ca-bundle\") pod \"741a3436-861d-4cb0-925e-597423d841a9\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.908191 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhdf\" (UniqueName: \"kubernetes.io/projected/741a3436-861d-4cb0-925e-597423d841a9-kube-api-access-flhdf\") 
pod \"741a3436-861d-4cb0-925e-597423d841a9\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.908240 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/741a3436-861d-4cb0-925e-597423d841a9-ovncontroller-config-0\") pod \"741a3436-861d-4cb0-925e-597423d841a9\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.908271 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-inventory\") pod \"741a3436-861d-4cb0-925e-597423d841a9\" (UID: \"741a3436-861d-4cb0-925e-597423d841a9\") " Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.915058 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "741a3436-861d-4cb0-925e-597423d841a9" (UID: "741a3436-861d-4cb0-925e-597423d841a9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.925678 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741a3436-861d-4cb0-925e-597423d841a9-kube-api-access-flhdf" (OuterVolumeSpecName: "kube-api-access-flhdf") pod "741a3436-861d-4cb0-925e-597423d841a9" (UID: "741a3436-861d-4cb0-925e-597423d841a9"). InnerVolumeSpecName "kube-api-access-flhdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.956933 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-inventory" (OuterVolumeSpecName: "inventory") pod "741a3436-861d-4cb0-925e-597423d841a9" (UID: "741a3436-861d-4cb0-925e-597423d841a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.957114 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "741a3436-861d-4cb0-925e-597423d841a9" (UID: "741a3436-861d-4cb0-925e-597423d841a9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:18.965527 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/741a3436-861d-4cb0-925e-597423d841a9-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "741a3436-861d-4cb0-925e-597423d841a9" (UID: "741a3436-861d-4cb0-925e-597423d841a9"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.012482 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.012505 4725 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.012514 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhdf\" (UniqueName: \"kubernetes.io/projected/741a3436-861d-4cb0-925e-597423d841a9-kube-api-access-flhdf\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.012522 4725 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/741a3436-861d-4cb0-925e-597423d841a9-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.012531 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/741a3436-861d-4cb0-925e-597423d841a9-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.298930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" event={"ID":"741a3436-861d-4cb0-925e-597423d841a9","Type":"ContainerDied","Data":"e31cd92cb22da8859fa5d571aed24e97f3b44296ff7d449290880b238807d8bf"} Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.298972 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e31cd92cb22da8859fa5d571aed24e97f3b44296ff7d449290880b238807d8bf" Feb 
27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.299046 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5c4sz" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.420693 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5"] Feb 27 06:47:19 crc kubenswrapper[4725]: E0227 06:47:19.421323 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="registry-server" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.421355 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="registry-server" Feb 27 06:47:19 crc kubenswrapper[4725]: E0227 06:47:19.421393 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="extract-utilities" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.421406 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="extract-utilities" Feb 27 06:47:19 crc kubenswrapper[4725]: E0227 06:47:19.421419 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="extract-content" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.421430 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="extract-content" Feb 27 06:47:19 crc kubenswrapper[4725]: E0227 06:47:19.421463 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741a3436-861d-4cb0-925e-597423d841a9" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.421474 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="741a3436-861d-4cb0-925e-597423d841a9" 
containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.421780 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1a971d-2eb1-43f9-9563-2c9a41fe1769" containerName="registry-server" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.421802 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="741a3436-861d-4cb0-925e-597423d841a9" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.422846 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.426530 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.426530 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.426845 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.427404 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.428009 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.429132 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.434007 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5"] Feb 27 06:47:19 crc 
kubenswrapper[4725]: I0227 06:47:19.527395 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.527459 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.527483 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.527520 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gct4\" (UniqueName: \"kubernetes.io/projected/b5bb9130-cfdc-481b-8e8a-c72f5562b963-kube-api-access-7gct4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.527768 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.527828 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.629373 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.629440 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.629520 
4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.629568 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.629598 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.629641 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gct4\" (UniqueName: \"kubernetes.io/projected/b5bb9130-cfdc-481b-8e8a-c72f5562b963-kube-api-access-7gct4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.634512 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.635065 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.635886 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.636534 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.641102 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.654506 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gct4\" (UniqueName: \"kubernetes.io/projected/b5bb9130-cfdc-481b-8e8a-c72f5562b963-kube-api-access-7gct4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:19 crc kubenswrapper[4725]: I0227 06:47:19.750169 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:47:20 crc kubenswrapper[4725]: I0227 06:47:20.366723 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5"] Feb 27 06:47:20 crc kubenswrapper[4725]: W0227 06:47:20.378193 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5bb9130_cfdc_481b_8e8a_c72f5562b963.slice/crio-e5373b75ecad35c57c16ec1e6d4357aad2e9ee05c7241fef449719cbe2e6831d WatchSource:0}: Error finding container e5373b75ecad35c57c16ec1e6d4357aad2e9ee05c7241fef449719cbe2e6831d: Status 404 returned error can't find the container with id e5373b75ecad35c57c16ec1e6d4357aad2e9ee05c7241fef449719cbe2e6831d Feb 27 06:47:20 crc kubenswrapper[4725]: I0227 06:47:20.380843 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:47:21 crc kubenswrapper[4725]: I0227 06:47:21.319151 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" 
event={"ID":"b5bb9130-cfdc-481b-8e8a-c72f5562b963","Type":"ContainerStarted","Data":"6bbd3797e97921d8dfd339c37aaa6338dc269e7380d678eb2ab91715897e411f"} Feb 27 06:47:21 crc kubenswrapper[4725]: I0227 06:47:21.319497 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" event={"ID":"b5bb9130-cfdc-481b-8e8a-c72f5562b963","Type":"ContainerStarted","Data":"e5373b75ecad35c57c16ec1e6d4357aad2e9ee05c7241fef449719cbe2e6831d"} Feb 27 06:47:21 crc kubenswrapper[4725]: I0227 06:47:21.343264 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" podStartSLOduration=1.8874328889999998 podStartE2EDuration="2.343246314s" podCreationTimestamp="2026-02-27 06:47:19 +0000 UTC" firstStartedPulling="2026-02-27 06:47:20.380665739 +0000 UTC m=+2218.843286308" lastFinishedPulling="2026-02-27 06:47:20.836479164 +0000 UTC m=+2219.299099733" observedRunningTime="2026-02-27 06:47:21.341868065 +0000 UTC m=+2219.804488644" watchObservedRunningTime="2026-02-27 06:47:21.343246314 +0000 UTC m=+2219.805866883" Feb 27 06:47:32 crc kubenswrapper[4725]: I0227 06:47:32.555074 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:47:32 crc kubenswrapper[4725]: I0227 06:47:32.555815 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.519599 4725 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvccb"] Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.523075 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.544449 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvccb"] Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.645599 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpt4q\" (UniqueName: \"kubernetes.io/projected/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-kube-api-access-mpt4q\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.645680 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-utilities\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.645713 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-catalog-content\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.747515 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpt4q\" (UniqueName: \"kubernetes.io/projected/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-kube-api-access-mpt4q\") pod 
\"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.747604 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-utilities\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.747639 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-catalog-content\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.748336 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-catalog-content\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.748464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-utilities\") pod \"community-operators-hvccb\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.771444 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpt4q\" (UniqueName: \"kubernetes.io/projected/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-kube-api-access-mpt4q\") pod \"community-operators-hvccb\" (UID: 
\"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:34 crc kubenswrapper[4725]: I0227 06:47:34.855340 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:35 crc kubenswrapper[4725]: I0227 06:47:35.381386 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvccb"] Feb 27 06:47:35 crc kubenswrapper[4725]: I0227 06:47:35.459168 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerStarted","Data":"632c1653f27ef4a4027acaf3af2d50cbf026d9f0680b7b1ee9f509cdfe3186d4"} Feb 27 06:47:36 crc kubenswrapper[4725]: I0227 06:47:36.477597 4725 generic.go:334] "Generic (PLEG): container finished" podID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerID="9b80eb7a2cc9da8b7855aa0d17785a26ad01f991bc2ebe9a231d10e4bf8c0d79" exitCode=0 Feb 27 06:47:36 crc kubenswrapper[4725]: I0227 06:47:36.477687 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerDied","Data":"9b80eb7a2cc9da8b7855aa0d17785a26ad01f991bc2ebe9a231d10e4bf8c0d79"} Feb 27 06:47:37 crc kubenswrapper[4725]: I0227 06:47:37.491922 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerStarted","Data":"339f25fb655f483329c7ae09556c2bb9a59156ebb7e5bfbb2c950f10ef91f63b"} Feb 27 06:47:39 crc kubenswrapper[4725]: I0227 06:47:39.515597 4725 generic.go:334] "Generic (PLEG): container finished" podID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerID="339f25fb655f483329c7ae09556c2bb9a59156ebb7e5bfbb2c950f10ef91f63b" exitCode=0 Feb 27 06:47:39 crc kubenswrapper[4725]: I0227 
06:47:39.515728 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerDied","Data":"339f25fb655f483329c7ae09556c2bb9a59156ebb7e5bfbb2c950f10ef91f63b"} Feb 27 06:47:40 crc kubenswrapper[4725]: I0227 06:47:40.530207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerStarted","Data":"fb6141e4c8835676bf4082a53051313522a050cf27b753440de903012cdb82e1"} Feb 27 06:47:40 crc kubenswrapper[4725]: I0227 06:47:40.553947 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvccb" podStartSLOduration=3.108924265 podStartE2EDuration="6.553669101s" podCreationTimestamp="2026-02-27 06:47:34 +0000 UTC" firstStartedPulling="2026-02-27 06:47:36.480961429 +0000 UTC m=+2234.943582028" lastFinishedPulling="2026-02-27 06:47:39.925706305 +0000 UTC m=+2238.388326864" observedRunningTime="2026-02-27 06:47:40.549854563 +0000 UTC m=+2239.012475132" watchObservedRunningTime="2026-02-27 06:47:40.553669101 +0000 UTC m=+2239.016289670" Feb 27 06:47:44 crc kubenswrapper[4725]: I0227 06:47:44.855817 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:44 crc kubenswrapper[4725]: I0227 06:47:44.856495 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:44 crc kubenswrapper[4725]: I0227 06:47:44.931785 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:45 crc kubenswrapper[4725]: I0227 06:47:45.676732 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvccb" Feb 
27 06:47:45 crc kubenswrapper[4725]: I0227 06:47:45.739846 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvccb"] Feb 27 06:47:47 crc kubenswrapper[4725]: I0227 06:47:47.626461 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvccb" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="registry-server" containerID="cri-o://fb6141e4c8835676bf4082a53051313522a050cf27b753440de903012cdb82e1" gracePeriod=2 Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.635873 4725 generic.go:334] "Generic (PLEG): container finished" podID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerID="fb6141e4c8835676bf4082a53051313522a050cf27b753440de903012cdb82e1" exitCode=0 Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.636174 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerDied","Data":"fb6141e4c8835676bf4082a53051313522a050cf27b753440de903012cdb82e1"} Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.755969 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.894896 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpt4q\" (UniqueName: \"kubernetes.io/projected/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-kube-api-access-mpt4q\") pod \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.895011 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-catalog-content\") pod \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.895098 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-utilities\") pod \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\" (UID: \"03d04e74-7cc7-4f7e-88ad-ec611abfb6da\") " Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.896140 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-utilities" (OuterVolumeSpecName: "utilities") pod "03d04e74-7cc7-4f7e-88ad-ec611abfb6da" (UID: "03d04e74-7cc7-4f7e-88ad-ec611abfb6da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.905607 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-kube-api-access-mpt4q" (OuterVolumeSpecName: "kube-api-access-mpt4q") pod "03d04e74-7cc7-4f7e-88ad-ec611abfb6da" (UID: "03d04e74-7cc7-4f7e-88ad-ec611abfb6da"). InnerVolumeSpecName "kube-api-access-mpt4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.956687 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03d04e74-7cc7-4f7e-88ad-ec611abfb6da" (UID: "03d04e74-7cc7-4f7e-88ad-ec611abfb6da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.997040 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpt4q\" (UniqueName: \"kubernetes.io/projected/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-kube-api-access-mpt4q\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.997072 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:48 crc kubenswrapper[4725]: I0227 06:47:48.997081 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03d04e74-7cc7-4f7e-88ad-ec611abfb6da-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.649140 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccb" event={"ID":"03d04e74-7cc7-4f7e-88ad-ec611abfb6da","Type":"ContainerDied","Data":"632c1653f27ef4a4027acaf3af2d50cbf026d9f0680b7b1ee9f509cdfe3186d4"} Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.650268 4725 scope.go:117] "RemoveContainer" containerID="fb6141e4c8835676bf4082a53051313522a050cf27b753440de903012cdb82e1" Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.649235 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvccb" Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.678415 4725 scope.go:117] "RemoveContainer" containerID="339f25fb655f483329c7ae09556c2bb9a59156ebb7e5bfbb2c950f10ef91f63b" Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.691083 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvccb"] Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.698856 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvccb"] Feb 27 06:47:49 crc kubenswrapper[4725]: I0227 06:47:49.723852 4725 scope.go:117] "RemoveContainer" containerID="9b80eb7a2cc9da8b7855aa0d17785a26ad01f991bc2ebe9a231d10e4bf8c0d79" Feb 27 06:47:50 crc kubenswrapper[4725]: I0227 06:47:50.272085 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" path="/var/lib/kubelet/pods/03d04e74-7cc7-4f7e-88ad-ec611abfb6da/volumes" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.169718 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536248-k6725"] Feb 27 06:48:00 crc kubenswrapper[4725]: E0227 06:48:00.170744 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="extract-utilities" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.170763 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="extract-utilities" Feb 27 06:48:00 crc kubenswrapper[4725]: E0227 06:48:00.170800 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="registry-server" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.170809 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" 
containerName="registry-server" Feb 27 06:48:00 crc kubenswrapper[4725]: E0227 06:48:00.170820 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="extract-content" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.170831 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="extract-content" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.171087 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d04e74-7cc7-4f7e-88ad-ec611abfb6da" containerName="registry-server" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.171918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.173940 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.173953 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.174861 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.185612 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536248-k6725"] Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.324385 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnstm\" (UniqueName: \"kubernetes.io/projected/84137321-69dc-4bf7-a4e8-4d3a3ff6600d-kube-api-access-pnstm\") pod \"auto-csr-approver-29536248-k6725\" (UID: \"84137321-69dc-4bf7-a4e8-4d3a3ff6600d\") " pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:00 crc 
kubenswrapper[4725]: I0227 06:48:00.426368 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnstm\" (UniqueName: \"kubernetes.io/projected/84137321-69dc-4bf7-a4e8-4d3a3ff6600d-kube-api-access-pnstm\") pod \"auto-csr-approver-29536248-k6725\" (UID: \"84137321-69dc-4bf7-a4e8-4d3a3ff6600d\") " pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.450205 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnstm\" (UniqueName: \"kubernetes.io/projected/84137321-69dc-4bf7-a4e8-4d3a3ff6600d-kube-api-access-pnstm\") pod \"auto-csr-approver-29536248-k6725\" (UID: \"84137321-69dc-4bf7-a4e8-4d3a3ff6600d\") " pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.502604 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:00 crc kubenswrapper[4725]: I0227 06:48:00.978005 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536248-k6725"] Feb 27 06:48:01 crc kubenswrapper[4725]: I0227 06:48:01.770977 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536248-k6725" event={"ID":"84137321-69dc-4bf7-a4e8-4d3a3ff6600d","Type":"ContainerStarted","Data":"6098d5ceafe87cef0a6fb16230d98431ee9590399e8cc651bd3a3f4060fb240c"} Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.554483 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.554786 4725 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.554836 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.555714 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.555780 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" gracePeriod=600 Feb 27 06:48:02 crc kubenswrapper[4725]: E0227 06:48:02.686069 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.780436 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" 
containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" exitCode=0 Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.780517 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"} Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.780602 4725 scope.go:117] "RemoveContainer" containerID="0b501af946b1ad795a1755f721e8e4751f676f233bfcb14cf18cba37c4f4ffaf" Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.781026 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:48:02 crc kubenswrapper[4725]: E0227 06:48:02.781329 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.781642 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536248-k6725" event={"ID":"84137321-69dc-4bf7-a4e8-4d3a3ff6600d","Type":"ContainerStarted","Data":"fc3db8e49e12cb8d6a3cc22a6ec83548d4bf4164202ceedd9e425f14c8d46291"} Feb 27 06:48:02 crc kubenswrapper[4725]: I0227 06:48:02.819602 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536248-k6725" podStartSLOduration=1.6141363480000002 podStartE2EDuration="2.819578598s" podCreationTimestamp="2026-02-27 06:48:00 +0000 UTC" firstStartedPulling="2026-02-27 06:48:00.977916337 +0000 UTC m=+2259.440536906" 
lastFinishedPulling="2026-02-27 06:48:02.183358577 +0000 UTC m=+2260.645979156" observedRunningTime="2026-02-27 06:48:02.819461244 +0000 UTC m=+2261.282081833" watchObservedRunningTime="2026-02-27 06:48:02.819578598 +0000 UTC m=+2261.282199187" Feb 27 06:48:03 crc kubenswrapper[4725]: I0227 06:48:03.798586 4725 generic.go:334] "Generic (PLEG): container finished" podID="84137321-69dc-4bf7-a4e8-4d3a3ff6600d" containerID="fc3db8e49e12cb8d6a3cc22a6ec83548d4bf4164202ceedd9e425f14c8d46291" exitCode=0 Feb 27 06:48:03 crc kubenswrapper[4725]: I0227 06:48:03.798668 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536248-k6725" event={"ID":"84137321-69dc-4bf7-a4e8-4d3a3ff6600d","Type":"ContainerDied","Data":"fc3db8e49e12cb8d6a3cc22a6ec83548d4bf4164202ceedd9e425f14c8d46291"} Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.164344 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.319989 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnstm\" (UniqueName: \"kubernetes.io/projected/84137321-69dc-4bf7-a4e8-4d3a3ff6600d-kube-api-access-pnstm\") pod \"84137321-69dc-4bf7-a4e8-4d3a3ff6600d\" (UID: \"84137321-69dc-4bf7-a4e8-4d3a3ff6600d\") " Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.327941 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84137321-69dc-4bf7-a4e8-4d3a3ff6600d-kube-api-access-pnstm" (OuterVolumeSpecName: "kube-api-access-pnstm") pod "84137321-69dc-4bf7-a4e8-4d3a3ff6600d" (UID: "84137321-69dc-4bf7-a4e8-4d3a3ff6600d"). InnerVolumeSpecName "kube-api-access-pnstm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.355421 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536242-pjw6w"] Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.362543 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536242-pjw6w"] Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.424093 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnstm\" (UniqueName: \"kubernetes.io/projected/84137321-69dc-4bf7-a4e8-4d3a3ff6600d-kube-api-access-pnstm\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.820059 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536248-k6725" event={"ID":"84137321-69dc-4bf7-a4e8-4d3a3ff6600d","Type":"ContainerDied","Data":"6098d5ceafe87cef0a6fb16230d98431ee9590399e8cc651bd3a3f4060fb240c"} Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.820098 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6098d5ceafe87cef0a6fb16230d98431ee9590399e8cc651bd3a3f4060fb240c" Feb 27 06:48:05 crc kubenswrapper[4725]: I0227 06:48:05.820105 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536248-k6725" Feb 27 06:48:06 crc kubenswrapper[4725]: I0227 06:48:06.271185 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2fedfdb-d283-478e-bc9a-7f225d1a6d63" path="/var/lib/kubelet/pods/e2fedfdb-d283-478e-bc9a-7f225d1a6d63/volumes" Feb 27 06:48:09 crc kubenswrapper[4725]: I0227 06:48:09.861423 4725 generic.go:334] "Generic (PLEG): container finished" podID="b5bb9130-cfdc-481b-8e8a-c72f5562b963" containerID="6bbd3797e97921d8dfd339c37aaa6338dc269e7380d678eb2ab91715897e411f" exitCode=0 Feb 27 06:48:09 crc kubenswrapper[4725]: I0227 06:48:09.861477 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" event={"ID":"b5bb9130-cfdc-481b-8e8a-c72f5562b963","Type":"ContainerDied","Data":"6bbd3797e97921d8dfd339c37aaa6338dc269e7380d678eb2ab91715897e411f"} Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.309633 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.452584 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gct4\" (UniqueName: \"kubernetes.io/projected/b5bb9130-cfdc-481b-8e8a-c72f5562b963-kube-api-access-7gct4\") pod \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.452634 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-metadata-combined-ca-bundle\") pod \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.452691 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-ovn-metadata-agent-neutron-config-0\") pod \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.452907 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-ssh-key-openstack-edpm-ipam\") pod \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.452960 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-inventory\") pod \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " Feb 27 
06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.453069 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-nova-metadata-neutron-config-0\") pod \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\" (UID: \"b5bb9130-cfdc-481b-8e8a-c72f5562b963\") " Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.463140 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b5bb9130-cfdc-481b-8e8a-c72f5562b963" (UID: "b5bb9130-cfdc-481b-8e8a-c72f5562b963"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.465268 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bb9130-cfdc-481b-8e8a-c72f5562b963-kube-api-access-7gct4" (OuterVolumeSpecName: "kube-api-access-7gct4") pod "b5bb9130-cfdc-481b-8e8a-c72f5562b963" (UID: "b5bb9130-cfdc-481b-8e8a-c72f5562b963"). InnerVolumeSpecName "kube-api-access-7gct4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.485407 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b5bb9130-cfdc-481b-8e8a-c72f5562b963" (UID: "b5bb9130-cfdc-481b-8e8a-c72f5562b963"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.486513 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "b5bb9130-cfdc-481b-8e8a-c72f5562b963" (UID: "b5bb9130-cfdc-481b-8e8a-c72f5562b963"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.487849 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-inventory" (OuterVolumeSpecName: "inventory") pod "b5bb9130-cfdc-481b-8e8a-c72f5562b963" (UID: "b5bb9130-cfdc-481b-8e8a-c72f5562b963"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.494857 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "b5bb9130-cfdc-481b-8e8a-c72f5562b963" (UID: "b5bb9130-cfdc-481b-8e8a-c72f5562b963"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.555047 4725 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.555106 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gct4\" (UniqueName: \"kubernetes.io/projected/b5bb9130-cfdc-481b-8e8a-c72f5562b963-kube-api-access-7gct4\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.555117 4725 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.555129 4725 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.555140 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.555151 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5bb9130-cfdc-481b-8e8a-c72f5562b963-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.883515 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" event={"ID":"b5bb9130-cfdc-481b-8e8a-c72f5562b963","Type":"ContainerDied","Data":"e5373b75ecad35c57c16ec1e6d4357aad2e9ee05c7241fef449719cbe2e6831d"} Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.883557 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5373b75ecad35c57c16ec1e6d4357aad2e9ee05c7241fef449719cbe2e6831d" Feb 27 06:48:11 crc kubenswrapper[4725]: I0227 06:48:11.883603 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.014793 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr"] Feb 27 06:48:12 crc kubenswrapper[4725]: E0227 06:48:12.015408 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84137321-69dc-4bf7-a4e8-4d3a3ff6600d" containerName="oc" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.015429 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="84137321-69dc-4bf7-a4e8-4d3a3ff6600d" containerName="oc" Feb 27 06:48:12 crc kubenswrapper[4725]: E0227 06:48:12.015447 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bb9130-cfdc-481b-8e8a-c72f5562b963" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.015457 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bb9130-cfdc-481b-8e8a-c72f5562b963" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.015666 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bb9130-cfdc-481b-8e8a-c72f5562b963" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.015687 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="84137321-69dc-4bf7-a4e8-4d3a3ff6600d" containerName="oc" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.016581 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.019039 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.019230 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.022326 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.022570 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.022692 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.025762 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr"] Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.169137 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.169221 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.169348 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvh4j\" (UniqueName: \"kubernetes.io/projected/501e41e3-55eb-4b62-b4b5-67f594761a64-kube-api-access-hvh4j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.169436 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.169768 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.278276 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.278405 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.278501 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.278658 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvh4j\" (UniqueName: \"kubernetes.io/projected/501e41e3-55eb-4b62-b4b5-67f594761a64-kube-api-access-hvh4j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.278837 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 
06:48:12.287265 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.288232 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.291166 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.293110 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.320646 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvh4j\" (UniqueName: \"kubernetes.io/projected/501e41e3-55eb-4b62-b4b5-67f594761a64-kube-api-access-hvh4j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr\" 
(UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.346304 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" Feb 27 06:48:12 crc kubenswrapper[4725]: I0227 06:48:12.948222 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr"] Feb 27 06:48:13 crc kubenswrapper[4725]: I0227 06:48:13.913833 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" event={"ID":"501e41e3-55eb-4b62-b4b5-67f594761a64","Type":"ContainerStarted","Data":"62f11a968f7846967e6d543d24de7048310a9e153804c446881cd084dc864ebd"} Feb 27 06:48:13 crc kubenswrapper[4725]: I0227 06:48:13.914275 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" event={"ID":"501e41e3-55eb-4b62-b4b5-67f594761a64","Type":"ContainerStarted","Data":"e2837fa0668f9ad97616cdc41fd470e438a4ba4fe635dfdfc2b700e22611aa55"} Feb 27 06:48:13 crc kubenswrapper[4725]: I0227 06:48:13.938807 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" podStartSLOduration=2.522192244 podStartE2EDuration="2.938789807s" podCreationTimestamp="2026-02-27 06:48:11 +0000 UTC" firstStartedPulling="2026-02-27 06:48:12.948131566 +0000 UTC m=+2271.410752175" lastFinishedPulling="2026-02-27 06:48:13.364729129 +0000 UTC m=+2271.827349738" observedRunningTime="2026-02-27 06:48:13.932401275 +0000 UTC m=+2272.395021854" watchObservedRunningTime="2026-02-27 06:48:13.938789807 +0000 UTC m=+2272.401410376" Feb 27 06:48:16 crc kubenswrapper[4725]: I0227 06:48:16.252712 4725 scope.go:117] "RemoveContainer" 
containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:48:16 crc kubenswrapper[4725]: E0227 06:48:16.253640 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:48:30 crc kubenswrapper[4725]: I0227 06:48:30.253216 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:48:30 crc kubenswrapper[4725]: E0227 06:48:30.254596 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:48:41 crc kubenswrapper[4725]: I0227 06:48:41.251132 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:48:41 crc kubenswrapper[4725]: E0227 06:48:41.251958 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.499166 4725 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sprfp"] Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.502341 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.510392 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sprfp"] Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.640251 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-utilities\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.640596 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmk9\" (UniqueName: \"kubernetes.io/projected/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-kube-api-access-lzmk9\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.640670 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-catalog-content\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.742626 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-utilities\") pod \"certified-operators-sprfp\" 
(UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.742715 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmk9\" (UniqueName: \"kubernetes.io/projected/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-kube-api-access-lzmk9\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.742762 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-catalog-content\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.743226 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-catalog-content\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.743460 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-utilities\") pod \"certified-operators-sprfp\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.763175 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmk9\" (UniqueName: \"kubernetes.io/projected/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-kube-api-access-lzmk9\") pod \"certified-operators-sprfp\" (UID: 
\"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:52 crc kubenswrapper[4725]: I0227 06:48:52.844842 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:48:53 crc kubenswrapper[4725]: I0227 06:48:53.469832 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sprfp"] Feb 27 06:48:53 crc kubenswrapper[4725]: W0227 06:48:53.473335 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82582077_3fa3_4e9c_a3f2_e9c4fa3a0b8d.slice/crio-8e4351f3d35ffc5009224357c12c94e95243d1724ea98a18d6650fd5f13ce325 WatchSource:0}: Error finding container 8e4351f3d35ffc5009224357c12c94e95243d1724ea98a18d6650fd5f13ce325: Status 404 returned error can't find the container with id 8e4351f3d35ffc5009224357c12c94e95243d1724ea98a18d6650fd5f13ce325 Feb 27 06:48:54 crc kubenswrapper[4725]: I0227 06:48:54.395150 4725 generic.go:334] "Generic (PLEG): container finished" podID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerID="515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1" exitCode=0 Feb 27 06:48:54 crc kubenswrapper[4725]: I0227 06:48:54.395228 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerDied","Data":"515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1"} Feb 27 06:48:54 crc kubenswrapper[4725]: I0227 06:48:54.395444 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerStarted","Data":"8e4351f3d35ffc5009224357c12c94e95243d1724ea98a18d6650fd5f13ce325"} Feb 27 06:48:55 crc kubenswrapper[4725]: I0227 06:48:55.251971 4725 scope.go:117] 
"RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:48:55 crc kubenswrapper[4725]: E0227 06:48:55.252430 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:48:55 crc kubenswrapper[4725]: I0227 06:48:55.407777 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerStarted","Data":"474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d"} Feb 27 06:48:58 crc kubenswrapper[4725]: I0227 06:48:58.441242 4725 generic.go:334] "Generic (PLEG): container finished" podID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerID="474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d" exitCode=0 Feb 27 06:48:58 crc kubenswrapper[4725]: I0227 06:48:58.441359 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerDied","Data":"474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d"} Feb 27 06:48:59 crc kubenswrapper[4725]: I0227 06:48:59.452685 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerStarted","Data":"91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391"} Feb 27 06:48:59 crc kubenswrapper[4725]: I0227 06:48:59.486250 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-sprfp" podStartSLOduration=3.019363016 podStartE2EDuration="7.486233896s" podCreationTimestamp="2026-02-27 06:48:52 +0000 UTC" firstStartedPulling="2026-02-27 06:48:54.397925756 +0000 UTC m=+2312.860546365" lastFinishedPulling="2026-02-27 06:48:58.864796676 +0000 UTC m=+2317.327417245" observedRunningTime="2026-02-27 06:48:59.476169851 +0000 UTC m=+2317.938790440" watchObservedRunningTime="2026-02-27 06:48:59.486233896 +0000 UTC m=+2317.948854465" Feb 27 06:49:02 crc kubenswrapper[4725]: I0227 06:49:02.845404 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:49:02 crc kubenswrapper[4725]: I0227 06:49:02.845961 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:49:02 crc kubenswrapper[4725]: I0227 06:49:02.909609 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:49:03 crc kubenswrapper[4725]: I0227 06:49:03.758640 4725 scope.go:117] "RemoveContainer" containerID="65b91a8b96cb888344c913d6927f7a13e950b1645d22a6d9c54d96ea2bcfe527" Feb 27 06:49:06 crc kubenswrapper[4725]: I0227 06:49:06.251898 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:49:06 crc kubenswrapper[4725]: E0227 06:49:06.252838 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:49:12 crc kubenswrapper[4725]: I0227 06:49:12.897201 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:49:12 crc kubenswrapper[4725]: I0227 06:49:12.964988 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sprfp"] Feb 27 06:49:13 crc kubenswrapper[4725]: I0227 06:49:13.587662 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sprfp" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="registry-server" containerID="cri-o://91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391" gracePeriod=2 Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.033206 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.136458 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-catalog-content\") pod \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.136567 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-utilities\") pod \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.136620 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzmk9\" (UniqueName: \"kubernetes.io/projected/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-kube-api-access-lzmk9\") pod \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\" (UID: \"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d\") " Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.137425 
4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-utilities" (OuterVolumeSpecName: "utilities") pod "82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" (UID: "82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.143140 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-kube-api-access-lzmk9" (OuterVolumeSpecName: "kube-api-access-lzmk9") pod "82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" (UID: "82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d"). InnerVolumeSpecName "kube-api-access-lzmk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.192055 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" (UID: "82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.238691 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.238722 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.238732 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzmk9\" (UniqueName: \"kubernetes.io/projected/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d-kube-api-access-lzmk9\") on node \"crc\" DevicePath \"\"" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.603579 4725 generic.go:334] "Generic (PLEG): container finished" podID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerID="91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391" exitCode=0 Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.603612 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sprfp" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.603634 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerDied","Data":"91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391"} Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.603682 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sprfp" event={"ID":"82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d","Type":"ContainerDied","Data":"8e4351f3d35ffc5009224357c12c94e95243d1724ea98a18d6650fd5f13ce325"} Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.603713 4725 scope.go:117] "RemoveContainer" containerID="91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.629494 4725 scope.go:117] "RemoveContainer" containerID="474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.631147 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sprfp"] Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.639968 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sprfp"] Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.657495 4725 scope.go:117] "RemoveContainer" containerID="515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.723520 4725 scope.go:117] "RemoveContainer" containerID="91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391" Feb 27 06:49:14 crc kubenswrapper[4725]: E0227 06:49:14.724119 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391\": container with ID starting with 91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391 not found: ID does not exist" containerID="91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.724151 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391"} err="failed to get container status \"91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391\": rpc error: code = NotFound desc = could not find container \"91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391\": container with ID starting with 91bb0821216a59f2faa757f139d27f9f6bdf115cf798e781b38056f20808e391 not found: ID does not exist" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.724171 4725 scope.go:117] "RemoveContainer" containerID="474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d" Feb 27 06:49:14 crc kubenswrapper[4725]: E0227 06:49:14.724593 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d\": container with ID starting with 474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d not found: ID does not exist" containerID="474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.724622 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d"} err="failed to get container status \"474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d\": rpc error: code = NotFound desc = could not find container \"474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d\": container with ID 
starting with 474bb3224152e0700bd8dac102a9aad881b84807398d65ff30ccfd990c5ade2d not found: ID does not exist" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.724641 4725 scope.go:117] "RemoveContainer" containerID="515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1" Feb 27 06:49:14 crc kubenswrapper[4725]: E0227 06:49:14.725014 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1\": container with ID starting with 515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1 not found: ID does not exist" containerID="515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1" Feb 27 06:49:14 crc kubenswrapper[4725]: I0227 06:49:14.725043 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1"} err="failed to get container status \"515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1\": rpc error: code = NotFound desc = could not find container \"515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1\": container with ID starting with 515ae242b9a6d0f5b9513a038987b307a5d26aa92d0dc074ce572f94e82546f1 not found: ID does not exist" Feb 27 06:49:16 crc kubenswrapper[4725]: I0227 06:49:16.266136 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" path="/var/lib/kubelet/pods/82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d/volumes" Feb 27 06:49:17 crc kubenswrapper[4725]: I0227 06:49:17.251158 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:49:17 crc kubenswrapper[4725]: E0227 06:49:17.251550 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:49:29 crc kubenswrapper[4725]: I0227 06:49:29.252385 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:49:29 crc kubenswrapper[4725]: E0227 06:49:29.253313 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:49:41 crc kubenswrapper[4725]: I0227 06:49:41.252055 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:49:41 crc kubenswrapper[4725]: E0227 06:49:41.252990 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:49:54 crc kubenswrapper[4725]: I0227 06:49:54.252259 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:49:54 crc kubenswrapper[4725]: E0227 06:49:54.253062 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.147208 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536250-lfdqn"] Feb 27 06:50:00 crc kubenswrapper[4725]: E0227 06:50:00.148430 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="registry-server" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.148455 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="registry-server" Feb 27 06:50:00 crc kubenswrapper[4725]: E0227 06:50:00.148489 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="extract-content" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.148502 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="extract-content" Feb 27 06:50:00 crc kubenswrapper[4725]: E0227 06:50:00.148547 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="extract-utilities" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.148563 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="extract-utilities" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.149003 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="82582077-3fa3-4e9c-a3f2-e9c4fa3a0b8d" containerName="registry-server" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.150207 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.153428 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.153694 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.153751 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.157223 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536250-lfdqn"] Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.250562 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh4pk\" (UniqueName: \"kubernetes.io/projected/0bf383bb-c952-4cbf-8f99-ffee8e8614f0-kube-api-access-sh4pk\") pod \"auto-csr-approver-29536250-lfdqn\" (UID: \"0bf383bb-c952-4cbf-8f99-ffee8e8614f0\") " pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.353005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh4pk\" (UniqueName: \"kubernetes.io/projected/0bf383bb-c952-4cbf-8f99-ffee8e8614f0-kube-api-access-sh4pk\") pod \"auto-csr-approver-29536250-lfdqn\" (UID: \"0bf383bb-c952-4cbf-8f99-ffee8e8614f0\") " pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.380058 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh4pk\" (UniqueName: \"kubernetes.io/projected/0bf383bb-c952-4cbf-8f99-ffee8e8614f0-kube-api-access-sh4pk\") pod \"auto-csr-approver-29536250-lfdqn\" (UID: \"0bf383bb-c952-4cbf-8f99-ffee8e8614f0\") " 
pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.467373 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:00 crc kubenswrapper[4725]: I0227 06:50:00.955576 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536250-lfdqn"] Feb 27 06:50:00 crc kubenswrapper[4725]: W0227 06:50:00.966402 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bf383bb_c952_4cbf_8f99_ffee8e8614f0.slice/crio-e1d1cdd53b3b277fc27b3a2d83c3e0e589d8cdda9d4f3e207cd70bb50f08f099 WatchSource:0}: Error finding container e1d1cdd53b3b277fc27b3a2d83c3e0e589d8cdda9d4f3e207cd70bb50f08f099: Status 404 returned error can't find the container with id e1d1cdd53b3b277fc27b3a2d83c3e0e589d8cdda9d4f3e207cd70bb50f08f099 Feb 27 06:50:01 crc kubenswrapper[4725]: I0227 06:50:01.123814 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" event={"ID":"0bf383bb-c952-4cbf-8f99-ffee8e8614f0","Type":"ContainerStarted","Data":"e1d1cdd53b3b277fc27b3a2d83c3e0e589d8cdda9d4f3e207cd70bb50f08f099"} Feb 27 06:50:03 crc kubenswrapper[4725]: I0227 06:50:03.151237 4725 generic.go:334] "Generic (PLEG): container finished" podID="0bf383bb-c952-4cbf-8f99-ffee8e8614f0" containerID="fecf89a29d64d1c5443a1075c6ce7911dd793143dafe7b3d403df22531ed1955" exitCode=0 Feb 27 06:50:03 crc kubenswrapper[4725]: I0227 06:50:03.151338 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" event={"ID":"0bf383bb-c952-4cbf-8f99-ffee8e8614f0","Type":"ContainerDied","Data":"fecf89a29d64d1c5443a1075c6ce7911dd793143dafe7b3d403df22531ed1955"} Feb 27 06:50:04 crc kubenswrapper[4725]: I0227 06:50:04.507219 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:04 crc kubenswrapper[4725]: I0227 06:50:04.562715 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh4pk\" (UniqueName: \"kubernetes.io/projected/0bf383bb-c952-4cbf-8f99-ffee8e8614f0-kube-api-access-sh4pk\") pod \"0bf383bb-c952-4cbf-8f99-ffee8e8614f0\" (UID: \"0bf383bb-c952-4cbf-8f99-ffee8e8614f0\") " Feb 27 06:50:04 crc kubenswrapper[4725]: I0227 06:50:04.568824 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf383bb-c952-4cbf-8f99-ffee8e8614f0-kube-api-access-sh4pk" (OuterVolumeSpecName: "kube-api-access-sh4pk") pod "0bf383bb-c952-4cbf-8f99-ffee8e8614f0" (UID: "0bf383bb-c952-4cbf-8f99-ffee8e8614f0"). InnerVolumeSpecName "kube-api-access-sh4pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:50:04 crc kubenswrapper[4725]: I0227 06:50:04.666731 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh4pk\" (UniqueName: \"kubernetes.io/projected/0bf383bb-c952-4cbf-8f99-ffee8e8614f0-kube-api-access-sh4pk\") on node \"crc\" DevicePath \"\"" Feb 27 06:50:05 crc kubenswrapper[4725]: I0227 06:50:05.182379 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" event={"ID":"0bf383bb-c952-4cbf-8f99-ffee8e8614f0","Type":"ContainerDied","Data":"e1d1cdd53b3b277fc27b3a2d83c3e0e589d8cdda9d4f3e207cd70bb50f08f099"} Feb 27 06:50:05 crc kubenswrapper[4725]: I0227 06:50:05.182420 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1d1cdd53b3b277fc27b3a2d83c3e0e589d8cdda9d4f3e207cd70bb50f08f099" Feb 27 06:50:05 crc kubenswrapper[4725]: I0227 06:50:05.182444 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536250-lfdqn" Feb 27 06:50:05 crc kubenswrapper[4725]: I0227 06:50:05.599088 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536244-9cfm7"] Feb 27 06:50:05 crc kubenswrapper[4725]: I0227 06:50:05.616168 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536244-9cfm7"] Feb 27 06:50:06 crc kubenswrapper[4725]: I0227 06:50:06.269344 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5de01ce-d531-45f2-bf05-f66ada293780" path="/var/lib/kubelet/pods/e5de01ce-d531-45f2-bf05-f66ada293780/volumes" Feb 27 06:50:08 crc kubenswrapper[4725]: I0227 06:50:08.251883 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:50:08 crc kubenswrapper[4725]: E0227 06:50:08.252159 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:50:20 crc kubenswrapper[4725]: I0227 06:50:20.252671 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:50:20 crc kubenswrapper[4725]: E0227 06:50:20.253910 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:50:33 crc kubenswrapper[4725]: I0227 06:50:33.251658 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:50:33 crc kubenswrapper[4725]: E0227 06:50:33.254018 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:50:48 crc kubenswrapper[4725]: I0227 06:50:48.251606 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:50:48 crc kubenswrapper[4725]: E0227 06:50:48.252310 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:51:03 crc kubenswrapper[4725]: I0227 06:51:03.251952 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:51:03 crc kubenswrapper[4725]: E0227 06:51:03.252851 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:51:03 crc kubenswrapper[4725]: I0227 06:51:03.883332 4725 scope.go:117] "RemoveContainer" containerID="874a410fe2e7ab449b90256e3396e21a9592feb7a40d64cbf9622d1eb8897daa" Feb 27 06:51:14 crc kubenswrapper[4725]: I0227 06:51:14.252015 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:51:14 crc kubenswrapper[4725]: E0227 06:51:14.253002 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:51:29 crc kubenswrapper[4725]: I0227 06:51:29.251258 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:51:29 crc kubenswrapper[4725]: E0227 06:51:29.252047 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:51:41 crc kubenswrapper[4725]: I0227 06:51:41.252202 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:51:41 crc kubenswrapper[4725]: E0227 06:51:41.254020 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:51:52 crc kubenswrapper[4725]: I0227 06:51:52.258599 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:51:52 crc kubenswrapper[4725]: E0227 06:51:52.259346 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.149521 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536252-wkxbg"] Feb 27 06:52:00 crc kubenswrapper[4725]: E0227 06:52:00.150560 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf383bb-c952-4cbf-8f99-ffee8e8614f0" containerName="oc" Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.150578 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf383bb-c952-4cbf-8f99-ffee8e8614f0" containerName="oc" Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.150865 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf383bb-c952-4cbf-8f99-ffee8e8614f0" containerName="oc" Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.151718 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.154531 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.157348 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.157703 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.163045 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536252-wkxbg"]
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.266125 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhks\" (UniqueName: \"kubernetes.io/projected/8c30d4fe-9081-49a6-a7cf-368637b3fa0c-kube-api-access-bjhks\") pod \"auto-csr-approver-29536252-wkxbg\" (UID: \"8c30d4fe-9081-49a6-a7cf-368637b3fa0c\") " pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.367851 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhks\" (UniqueName: \"kubernetes.io/projected/8c30d4fe-9081-49a6-a7cf-368637b3fa0c-kube-api-access-bjhks\") pod \"auto-csr-approver-29536252-wkxbg\" (UID: \"8c30d4fe-9081-49a6-a7cf-368637b3fa0c\") " pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.386775 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhks\" (UniqueName: \"kubernetes.io/projected/8c30d4fe-9081-49a6-a7cf-368637b3fa0c-kube-api-access-bjhks\") pod \"auto-csr-approver-29536252-wkxbg\" (UID: \"8c30d4fe-9081-49a6-a7cf-368637b3fa0c\") " pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.475458 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:00 crc kubenswrapper[4725]: I0227 06:52:00.956335 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536252-wkxbg"]
Feb 27 06:52:00 crc kubenswrapper[4725]: W0227 06:52:00.965932 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c30d4fe_9081_49a6_a7cf_368637b3fa0c.slice/crio-ae7629b266952b68a9befb714b768eb15150094136f3250f0e7aa92322a5e9b6 WatchSource:0}: Error finding container ae7629b266952b68a9befb714b768eb15150094136f3250f0e7aa92322a5e9b6: Status 404 returned error can't find the container with id ae7629b266952b68a9befb714b768eb15150094136f3250f0e7aa92322a5e9b6
Feb 27 06:52:01 crc kubenswrapper[4725]: I0227 06:52:01.422095 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536252-wkxbg" event={"ID":"8c30d4fe-9081-49a6-a7cf-368637b3fa0c","Type":"ContainerStarted","Data":"ae7629b266952b68a9befb714b768eb15150094136f3250f0e7aa92322a5e9b6"}
Feb 27 06:52:03 crc kubenswrapper[4725]: I0227 06:52:03.453569 4725 generic.go:334] "Generic (PLEG): container finished" podID="8c30d4fe-9081-49a6-a7cf-368637b3fa0c" containerID="288837e3e4e3f76ded5e21704f5af7621f95172ee4b87dd6a4f989cf1f63b58f" exitCode=0
Feb 27 06:52:03 crc kubenswrapper[4725]: I0227 06:52:03.453654 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536252-wkxbg" event={"ID":"8c30d4fe-9081-49a6-a7cf-368637b3fa0c","Type":"ContainerDied","Data":"288837e3e4e3f76ded5e21704f5af7621f95172ee4b87dd6a4f989cf1f63b58f"}
Feb 27 06:52:04 crc kubenswrapper[4725]: I0227 06:52:04.252374 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"
Feb 27 06:52:04 crc kubenswrapper[4725]: E0227 06:52:04.253012 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 06:52:04 crc kubenswrapper[4725]: I0227 06:52:04.914371 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:05 crc kubenswrapper[4725]: I0227 06:52:05.065608 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjhks\" (UniqueName: \"kubernetes.io/projected/8c30d4fe-9081-49a6-a7cf-368637b3fa0c-kube-api-access-bjhks\") pod \"8c30d4fe-9081-49a6-a7cf-368637b3fa0c\" (UID: \"8c30d4fe-9081-49a6-a7cf-368637b3fa0c\") "
Feb 27 06:52:05 crc kubenswrapper[4725]: I0227 06:52:05.072119 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c30d4fe-9081-49a6-a7cf-368637b3fa0c-kube-api-access-bjhks" (OuterVolumeSpecName: "kube-api-access-bjhks") pod "8c30d4fe-9081-49a6-a7cf-368637b3fa0c" (UID: "8c30d4fe-9081-49a6-a7cf-368637b3fa0c"). InnerVolumeSpecName "kube-api-access-bjhks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:52:05 crc kubenswrapper[4725]: I0227 06:52:05.170184 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjhks\" (UniqueName: \"kubernetes.io/projected/8c30d4fe-9081-49a6-a7cf-368637b3fa0c-kube-api-access-bjhks\") on node \"crc\" DevicePath \"\""
Feb 27 06:52:05 crc kubenswrapper[4725]: I0227 06:52:05.483712 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536252-wkxbg" event={"ID":"8c30d4fe-9081-49a6-a7cf-368637b3fa0c","Type":"ContainerDied","Data":"ae7629b266952b68a9befb714b768eb15150094136f3250f0e7aa92322a5e9b6"}
Feb 27 06:52:05 crc kubenswrapper[4725]: I0227 06:52:05.483751 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7629b266952b68a9befb714b768eb15150094136f3250f0e7aa92322a5e9b6"
Feb 27 06:52:05 crc kubenswrapper[4725]: I0227 06:52:05.483763 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536252-wkxbg"
Feb 27 06:52:06 crc kubenswrapper[4725]: I0227 06:52:06.009948 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536246-t6rcx"]
Feb 27 06:52:06 crc kubenswrapper[4725]: I0227 06:52:06.022241 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536246-t6rcx"]
Feb 27 06:52:06 crc kubenswrapper[4725]: I0227 06:52:06.262760 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e720dc9-93d2-4b2a-8a84-987f8f987324" path="/var/lib/kubelet/pods/4e720dc9-93d2-4b2a-8a84-987f8f987324/volumes"
Feb 27 06:52:16 crc kubenswrapper[4725]: I0227 06:52:16.641463 4725 generic.go:334] "Generic (PLEG): container finished" podID="501e41e3-55eb-4b62-b4b5-67f594761a64" containerID="62f11a968f7846967e6d543d24de7048310a9e153804c446881cd084dc864ebd" exitCode=0
Feb 27 06:52:16 crc kubenswrapper[4725]: I0227 06:52:16.641581 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" event={"ID":"501e41e3-55eb-4b62-b4b5-67f594761a64","Type":"ContainerDied","Data":"62f11a968f7846967e6d543d24de7048310a9e153804c446881cd084dc864ebd"}
Feb 27 06:52:17 crc kubenswrapper[4725]: I0227 06:52:17.251896 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"
Feb 27 06:52:17 crc kubenswrapper[4725]: E0227 06:52:17.252653 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.081669 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.200551 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-ssh-key-openstack-edpm-ipam\") pod \"501e41e3-55eb-4b62-b4b5-67f594761a64\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") "
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.200733 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-inventory\") pod \"501e41e3-55eb-4b62-b4b5-67f594761a64\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") "
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.200867 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvh4j\" (UniqueName: \"kubernetes.io/projected/501e41e3-55eb-4b62-b4b5-67f594761a64-kube-api-access-hvh4j\") pod \"501e41e3-55eb-4b62-b4b5-67f594761a64\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") "
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.200897 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-combined-ca-bundle\") pod \"501e41e3-55eb-4b62-b4b5-67f594761a64\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") "
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.201016 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-secret-0\") pod \"501e41e3-55eb-4b62-b4b5-67f594761a64\" (UID: \"501e41e3-55eb-4b62-b4b5-67f594761a64\") "
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.206093 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "501e41e3-55eb-4b62-b4b5-67f594761a64" (UID: "501e41e3-55eb-4b62-b4b5-67f594761a64"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.206645 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501e41e3-55eb-4b62-b4b5-67f594761a64-kube-api-access-hvh4j" (OuterVolumeSpecName: "kube-api-access-hvh4j") pod "501e41e3-55eb-4b62-b4b5-67f594761a64" (UID: "501e41e3-55eb-4b62-b4b5-67f594761a64"). InnerVolumeSpecName "kube-api-access-hvh4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.228744 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "501e41e3-55eb-4b62-b4b5-67f594761a64" (UID: "501e41e3-55eb-4b62-b4b5-67f594761a64"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.229871 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-inventory" (OuterVolumeSpecName: "inventory") pod "501e41e3-55eb-4b62-b4b5-67f594761a64" (UID: "501e41e3-55eb-4b62-b4b5-67f594761a64"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.231162 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "501e41e3-55eb-4b62-b4b5-67f594761a64" (UID: "501e41e3-55eb-4b62-b4b5-67f594761a64"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.303425 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvh4j\" (UniqueName: \"kubernetes.io/projected/501e41e3-55eb-4b62-b4b5-67f594761a64-kube-api-access-hvh4j\") on node \"crc\" DevicePath \"\""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.303461 4725 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.303471 4725 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.303480 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.303490 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501e41e3-55eb-4b62-b4b5-67f594761a64-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.672023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr" event={"ID":"501e41e3-55eb-4b62-b4b5-67f594761a64","Type":"ContainerDied","Data":"e2837fa0668f9ad97616cdc41fd470e438a4ba4fe635dfdfc2b700e22611aa55"}
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.672081 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2837fa0668f9ad97616cdc41fd470e438a4ba4fe635dfdfc2b700e22611aa55"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.672098 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.774542 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"]
Feb 27 06:52:18 crc kubenswrapper[4725]: E0227 06:52:18.779006 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501e41e3-55eb-4b62-b4b5-67f594761a64" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.779124 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="501e41e3-55eb-4b62-b4b5-67f594761a64" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 27 06:52:18 crc kubenswrapper[4725]: E0227 06:52:18.779196 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c30d4fe-9081-49a6-a7cf-368637b3fa0c" containerName="oc"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.779259 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c30d4fe-9081-49a6-a7cf-368637b3fa0c" containerName="oc"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.779621 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c30d4fe-9081-49a6-a7cf-368637b3fa0c" containerName="oc"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.779729 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="501e41e3-55eb-4b62-b4b5-67f594761a64" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.780677 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.785389 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.785587 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.785696 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.785967 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.786104 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.789449 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"]
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.790490 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.790753 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921502 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921592 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921622 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921649 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921699 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921724 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921752 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921819 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921862 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t2zd\" (UniqueName: \"kubernetes.io/projected/8ac7b33c-a85a-436b-b4c1-560c074fab9b-kube-api-access-2t2zd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921896 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:18 crc kubenswrapper[4725]: I0227 06:52:18.921929 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.023896 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.023955 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.023987 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024048 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024077 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t2zd\" (UniqueName: \"kubernetes.io/projected/8ac7b33c-a85a-436b-b4c1-560c074fab9b-kube-api-access-2t2zd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024104 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024126 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024182 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024219 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024237 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.024256 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.027372 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.031995 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.035536 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.035631 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.036172 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.037152 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.045822 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.047171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.056566 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.059143 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t2zd\" (UniqueName: \"kubernetes.io/projected/8ac7b33c-a85a-436b-b4c1-560c074fab9b-kube-api-access-2t2zd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.059806 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wjx2k\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.099027 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"
Feb 27 06:52:19 crc kubenswrapper[4725]: W0227 06:52:19.707768 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ac7b33c_a85a_436b_b4c1_560c074fab9b.slice/crio-fd33bc19cece3e6b3f2baa2f14b44f2b7951515f0df9e7163590177ff1a4b829 WatchSource:0}: Error finding container fd33bc19cece3e6b3f2baa2f14b44f2b7951515f0df9e7163590177ff1a4b829: Status 404 returned error can't find the container with id fd33bc19cece3e6b3f2baa2f14b44f2b7951515f0df9e7163590177ff1a4b829
Feb 27 06:52:19 crc kubenswrapper[4725]: I0227 06:52:19.757026 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k"]
Feb 27 06:52:20 crc kubenswrapper[4725]: I0227 06:52:20.691029 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" event={"ID":"8ac7b33c-a85a-436b-b4c1-560c074fab9b","Type":"ContainerStarted","Data":"fd33bc19cece3e6b3f2baa2f14b44f2b7951515f0df9e7163590177ff1a4b829"}
Feb 27 06:52:21 crc kubenswrapper[4725]: I0227 06:52:21.699566 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" event={"ID":"8ac7b33c-a85a-436b-b4c1-560c074fab9b","Type":"ContainerStarted","Data":"d1d66685f4eba0cda76c883f6e42275efd742986656181b7cb9c736d9a263c36"}
Feb 27 06:52:21 crc kubenswrapper[4725]: I0227 06:52:21.724660 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" podStartSLOduration=2.897180994 podStartE2EDuration="3.724641491s" podCreationTimestamp="2026-02-27 06:52:18 +0000 UTC" firstStartedPulling="2026-02-27 06:52:19.710953239 +0000 UTC m=+2518.173573808" lastFinishedPulling="2026-02-27 06:52:20.538413716 +0000 UTC m=+2519.001034305" observedRunningTime="2026-02-27 06:52:21.720481144 +0000 UTC m=+2520.183101713" watchObservedRunningTime="2026-02-27 06:52:21.724641491 +0000 UTC m=+2520.187262060"
Feb 27 06:52:29 crc kubenswrapper[4725]: I0227 06:52:29.251943 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"
Feb 27 06:52:29 crc kubenswrapper[4725]: E0227 06:52:29.252710 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 06:52:43 crc kubenswrapper[4725]: I0227 06:52:43.251840 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"
Feb 27 06:52:43 crc kubenswrapper[4725]: E0227 06:52:43.253006 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 06:52:57 crc kubenswrapper[4725]: I0227 06:52:57.252736 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"
Feb 27 06:52:57 crc kubenswrapper[4725]: E0227 06:52:57.253690 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 06:53:03 crc kubenswrapper[4725]: I0227 06:53:03.975581 4725 scope.go:117] "RemoveContainer" containerID="e9f8dadc1599cc5f42da0d8b897acab3121df43bb5622a2de3bb267e28c858da"
Feb 27 06:53:12 crc kubenswrapper[4725]: I0227 06:53:12.264512 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5"
Feb 27 06:53:13 crc kubenswrapper[4725]: I0227 06:53:13.271035 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"dc4f505db5c927b1115d9bb5cb4e6488533e4b86cbba3fd83d6f19a11a64f81b"}
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.139601 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536254-zmcfp"]
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.141696 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536254-zmcfp"
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.143885 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.144049 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.144443 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.150732 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536254-zmcfp"]
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.295278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-989cg\" (UniqueName: \"kubernetes.io/projected/08cdf6e1-7c17-4514-9106-be74317e08b1-kube-api-access-989cg\") pod \"auto-csr-approver-29536254-zmcfp\" (UID: \"08cdf6e1-7c17-4514-9106-be74317e08b1\") " pod="openshift-infra/auto-csr-approver-29536254-zmcfp"
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.396778 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-989cg\" (UniqueName: \"kubernetes.io/projected/08cdf6e1-7c17-4514-9106-be74317e08b1-kube-api-access-989cg\") pod \"auto-csr-approver-29536254-zmcfp\" (UID: \"08cdf6e1-7c17-4514-9106-be74317e08b1\") " pod="openshift-infra/auto-csr-approver-29536254-zmcfp"
Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.421306 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-989cg\" (UniqueName: \"kubernetes.io/projected/08cdf6e1-7c17-4514-9106-be74317e08b1-kube-api-access-989cg\") pod \"auto-csr-approver-29536254-zmcfp\" (UID: \"08cdf6e1-7c17-4514-9106-be74317e08b1\") "
pod="openshift-infra/auto-csr-approver-29536254-zmcfp" Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.469349 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.934670 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536254-zmcfp"] Feb 27 06:54:00 crc kubenswrapper[4725]: I0227 06:54:00.942333 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:54:01 crc kubenswrapper[4725]: I0227 06:54:01.730545 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" event={"ID":"08cdf6e1-7c17-4514-9106-be74317e08b1","Type":"ContainerStarted","Data":"18b67851d5c8a4d6093a7d7d82724591796060d03315791d31c1d712bfbc8712"} Feb 27 06:54:02 crc kubenswrapper[4725]: I0227 06:54:02.742280 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" event={"ID":"08cdf6e1-7c17-4514-9106-be74317e08b1","Type":"ContainerStarted","Data":"9e211eea55d7ea1ef8bff2b561ba8c249722e7ea7aaad8334d07f23ac4026325"} Feb 27 06:54:03 crc kubenswrapper[4725]: I0227 06:54:03.754188 4725 generic.go:334] "Generic (PLEG): container finished" podID="08cdf6e1-7c17-4514-9106-be74317e08b1" containerID="9e211eea55d7ea1ef8bff2b561ba8c249722e7ea7aaad8334d07f23ac4026325" exitCode=0 Feb 27 06:54:03 crc kubenswrapper[4725]: I0227 06:54:03.754232 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" event={"ID":"08cdf6e1-7c17-4514-9106-be74317e08b1","Type":"ContainerDied","Data":"9e211eea55d7ea1ef8bff2b561ba8c249722e7ea7aaad8334d07f23ac4026325"} Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.162530 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.308369 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-989cg\" (UniqueName: \"kubernetes.io/projected/08cdf6e1-7c17-4514-9106-be74317e08b1-kube-api-access-989cg\") pod \"08cdf6e1-7c17-4514-9106-be74317e08b1\" (UID: \"08cdf6e1-7c17-4514-9106-be74317e08b1\") " Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.318417 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08cdf6e1-7c17-4514-9106-be74317e08b1-kube-api-access-989cg" (OuterVolumeSpecName: "kube-api-access-989cg") pod "08cdf6e1-7c17-4514-9106-be74317e08b1" (UID: "08cdf6e1-7c17-4514-9106-be74317e08b1"). InnerVolumeSpecName "kube-api-access-989cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.346276 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536248-k6725"] Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.357608 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536248-k6725"] Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.413320 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-989cg\" (UniqueName: \"kubernetes.io/projected/08cdf6e1-7c17-4514-9106-be74317e08b1-kube-api-access-989cg\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.787775 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" event={"ID":"08cdf6e1-7c17-4514-9106-be74317e08b1","Type":"ContainerDied","Data":"18b67851d5c8a4d6093a7d7d82724591796060d03315791d31c1d712bfbc8712"} Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.787842 4725 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="18b67851d5c8a4d6093a7d7d82724591796060d03315791d31c1d712bfbc8712" Feb 27 06:54:05 crc kubenswrapper[4725]: I0227 06:54:05.787847 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536254-zmcfp" Feb 27 06:54:06 crc kubenswrapper[4725]: I0227 06:54:06.270715 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84137321-69dc-4bf7-a4e8-4d3a3ff6600d" path="/var/lib/kubelet/pods/84137321-69dc-4bf7-a4e8-4d3a3ff6600d/volumes" Feb 27 06:54:46 crc kubenswrapper[4725]: I0227 06:54:46.216872 4725 generic.go:334] "Generic (PLEG): container finished" podID="8ac7b33c-a85a-436b-b4c1-560c074fab9b" containerID="d1d66685f4eba0cda76c883f6e42275efd742986656181b7cb9c736d9a263c36" exitCode=0 Feb 27 06:54:46 crc kubenswrapper[4725]: I0227 06:54:46.216979 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" event={"ID":"8ac7b33c-a85a-436b-b4c1-560c074fab9b","Type":"ContainerDied","Data":"d1d66685f4eba0cda76c883f6e42275efd742986656181b7cb9c736d9a263c36"} Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.834855 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973440 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-extra-config-0\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973547 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-ssh-key-openstack-edpm-ipam\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973569 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t2zd\" (UniqueName: \"kubernetes.io/projected/8ac7b33c-a85a-436b-b4c1-560c074fab9b-kube-api-access-2t2zd\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973628 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-0\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973695 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-2\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 
06:54:47.973718 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-inventory\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973784 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-0\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973823 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-combined-ca-bundle\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973882 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-3\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973915 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-1\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.973996 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-1\") pod \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\" (UID: \"8ac7b33c-a85a-436b-b4c1-560c074fab9b\") " Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.979220 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac7b33c-a85a-436b-b4c1-560c074fab9b-kube-api-access-2t2zd" (OuterVolumeSpecName: "kube-api-access-2t2zd") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "kube-api-access-2t2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:54:47 crc kubenswrapper[4725]: I0227 06:54:47.988458 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.010438 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.010944 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-inventory" (OuterVolumeSpecName: "inventory") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.015220 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.015533 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.017243 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.024823 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.026548 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.036483 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.046311 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8ac7b33c-a85a-436b-b4c1-560c074fab9b" (UID: "8ac7b33c-a85a-436b-b4c1-560c074fab9b"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.079976 4725 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080389 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080408 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t2zd\" (UniqueName: \"kubernetes.io/projected/8ac7b33c-a85a-436b-b4c1-560c074fab9b-kube-api-access-2t2zd\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080421 4725 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080435 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080448 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080459 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-0\") on 
node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080514 4725 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080529 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080541 4725 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.080557 4725 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8ac7b33c-a85a-436b-b4c1-560c074fab9b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.241919 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" event={"ID":"8ac7b33c-a85a-436b-b4c1-560c074fab9b","Type":"ContainerDied","Data":"fd33bc19cece3e6b3f2baa2f14b44f2b7951515f0df9e7163590177ff1a4b829"} Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.242180 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd33bc19cece3e6b3f2baa2f14b44f2b7951515f0df9e7163590177ff1a4b829" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.241954 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wjx2k" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.365107 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph"] Feb 27 06:54:48 crc kubenswrapper[4725]: E0227 06:54:48.365853 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08cdf6e1-7c17-4514-9106-be74317e08b1" containerName="oc" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.365943 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cdf6e1-7c17-4514-9106-be74317e08b1" containerName="oc" Feb 27 06:54:48 crc kubenswrapper[4725]: E0227 06:54:48.366054 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac7b33c-a85a-436b-b4c1-560c074fab9b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.366142 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac7b33c-a85a-436b-b4c1-560c074fab9b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.366535 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac7b33c-a85a-436b-b4c1-560c074fab9b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.366645 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="08cdf6e1-7c17-4514-9106-be74317e08b1" containerName="oc" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.367572 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.375480 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.375771 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.375784 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph"] Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.376210 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.376332 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-msxnl" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.376516 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.488955 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.489202 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: 
\"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.489275 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.489378 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.489527 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.489581 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.489638 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g25wx\" (UniqueName: \"kubernetes.io/projected/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-kube-api-access-g25wx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.591932 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.592000 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.592091 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.592201 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.592249 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.592325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g25wx\" (UniqueName: \"kubernetes.io/projected/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-kube-api-access-g25wx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.592465 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.596660 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.597161 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.597637 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.597900 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.599636 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.599706 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.609602 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g25wx\" (UniqueName: \"kubernetes.io/projected/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-kube-api-access-g25wx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:48 crc kubenswrapper[4725]: I0227 06:54:48.694863 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:54:49 crc kubenswrapper[4725]: I0227 06:54:49.283032 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph"] Feb 27 06:54:50 crc kubenswrapper[4725]: I0227 06:54:50.267617 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" event={"ID":"9898977d-f2bf-4be4-9b90-82fbcc11ba8b","Type":"ContainerStarted","Data":"1d52a19bbba113ccc275c41b069328e44540e19ea97368bedb0a353471b9a5f8"} Feb 27 06:54:50 crc kubenswrapper[4725]: I0227 06:54:50.267910 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" event={"ID":"9898977d-f2bf-4be4-9b90-82fbcc11ba8b","Type":"ContainerStarted","Data":"349e978ae67377e05ff122aa5373650887a614be825f1d62046fb767964f928b"} Feb 27 06:54:50 crc kubenswrapper[4725]: I0227 06:54:50.296074 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" podStartSLOduration=1.77199084 podStartE2EDuration="2.296043316s" podCreationTimestamp="2026-02-27 06:54:48 +0000 UTC" firstStartedPulling="2026-02-27 06:54:49.292974047 +0000 UTC m=+2667.755594636" lastFinishedPulling="2026-02-27 06:54:49.817026543 +0000 UTC m=+2668.279647112" observedRunningTime="2026-02-27 06:54:50.292793374 +0000 UTC m=+2668.755413943" watchObservedRunningTime="2026-02-27 06:54:50.296043316 +0000 UTC m=+2668.758663895" Feb 27 06:55:04 crc kubenswrapper[4725]: I0227 06:55:04.068586 4725 scope.go:117] "RemoveContainer" containerID="fc3db8e49e12cb8d6a3cc22a6ec83548d4bf4164202ceedd9e425f14c8d46291" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.201643 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4f8dx"] Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 
06:55:07.204442 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.217416 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f8dx"] Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.386589 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-catalog-content\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.386681 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-utilities\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.386726 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntlfk\" (UniqueName: \"kubernetes.io/projected/17b96904-2704-4fa1-b6a0-678ff1ce1899-kube-api-access-ntlfk\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.488974 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-catalog-content\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: 
I0227 06:55:07.489078 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-utilities\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.489129 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntlfk\" (UniqueName: \"kubernetes.io/projected/17b96904-2704-4fa1-b6a0-678ff1ce1899-kube-api-access-ntlfk\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.489648 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-catalog-content\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.489678 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-utilities\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.511122 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntlfk\" (UniqueName: \"kubernetes.io/projected/17b96904-2704-4fa1-b6a0-678ff1ce1899-kube-api-access-ntlfk\") pod \"redhat-marketplace-4f8dx\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:07 crc kubenswrapper[4725]: I0227 06:55:07.523614 4725 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:08 crc kubenswrapper[4725]: I0227 06:55:08.094819 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f8dx"] Feb 27 06:55:08 crc kubenswrapper[4725]: W0227 06:55:08.095941 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b96904_2704_4fa1_b6a0_678ff1ce1899.slice/crio-068d2ca2261a9cfd29c86945798b896011f78308782c4f05e1cee2b727c0cb92 WatchSource:0}: Error finding container 068d2ca2261a9cfd29c86945798b896011f78308782c4f05e1cee2b727c0cb92: Status 404 returned error can't find the container with id 068d2ca2261a9cfd29c86945798b896011f78308782c4f05e1cee2b727c0cb92 Feb 27 06:55:08 crc kubenswrapper[4725]: I0227 06:55:08.432185 4725 generic.go:334] "Generic (PLEG): container finished" podID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerID="7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b" exitCode=0 Feb 27 06:55:08 crc kubenswrapper[4725]: I0227 06:55:08.432263 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f8dx" event={"ID":"17b96904-2704-4fa1-b6a0-678ff1ce1899","Type":"ContainerDied","Data":"7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b"} Feb 27 06:55:08 crc kubenswrapper[4725]: I0227 06:55:08.432527 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f8dx" event={"ID":"17b96904-2704-4fa1-b6a0-678ff1ce1899","Type":"ContainerStarted","Data":"068d2ca2261a9cfd29c86945798b896011f78308782c4f05e1cee2b727c0cb92"} Feb 27 06:55:10 crc kubenswrapper[4725]: I0227 06:55:10.453236 4725 generic.go:334] "Generic (PLEG): container finished" podID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerID="b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044" exitCode=0 Feb 27 06:55:10 crc 
kubenswrapper[4725]: I0227 06:55:10.453331 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f8dx" event={"ID":"17b96904-2704-4fa1-b6a0-678ff1ce1899","Type":"ContainerDied","Data":"b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044"} Feb 27 06:55:11 crc kubenswrapper[4725]: I0227 06:55:11.465358 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f8dx" event={"ID":"17b96904-2704-4fa1-b6a0-678ff1ce1899","Type":"ContainerStarted","Data":"ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1"} Feb 27 06:55:11 crc kubenswrapper[4725]: I0227 06:55:11.492932 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4f8dx" podStartSLOduration=2.079989238 podStartE2EDuration="4.49290543s" podCreationTimestamp="2026-02-27 06:55:07 +0000 UTC" firstStartedPulling="2026-02-27 06:55:08.434391807 +0000 UTC m=+2686.897012396" lastFinishedPulling="2026-02-27 06:55:10.847308019 +0000 UTC m=+2689.309928588" observedRunningTime="2026-02-27 06:55:11.486248822 +0000 UTC m=+2689.948869401" watchObservedRunningTime="2026-02-27 06:55:11.49290543 +0000 UTC m=+2689.955525999" Feb 27 06:55:17 crc kubenswrapper[4725]: I0227 06:55:17.524042 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:17 crc kubenswrapper[4725]: I0227 06:55:17.524343 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:17 crc kubenswrapper[4725]: I0227 06:55:17.567502 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:18 crc kubenswrapper[4725]: I0227 06:55:18.592822 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:18 crc kubenswrapper[4725]: I0227 06:55:18.656138 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f8dx"] Feb 27 06:55:20 crc kubenswrapper[4725]: I0227 06:55:20.554748 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4f8dx" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="registry-server" containerID="cri-o://ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1" gracePeriod=2 Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.049479 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.180800 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-catalog-content\") pod \"17b96904-2704-4fa1-b6a0-678ff1ce1899\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.180956 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntlfk\" (UniqueName: \"kubernetes.io/projected/17b96904-2704-4fa1-b6a0-678ff1ce1899-kube-api-access-ntlfk\") pod \"17b96904-2704-4fa1-b6a0-678ff1ce1899\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.181144 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-utilities\") pod \"17b96904-2704-4fa1-b6a0-678ff1ce1899\" (UID: \"17b96904-2704-4fa1-b6a0-678ff1ce1899\") " Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.181923 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-utilities" (OuterVolumeSpecName: "utilities") pod "17b96904-2704-4fa1-b6a0-678ff1ce1899" (UID: "17b96904-2704-4fa1-b6a0-678ff1ce1899"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.189447 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b96904-2704-4fa1-b6a0-678ff1ce1899-kube-api-access-ntlfk" (OuterVolumeSpecName: "kube-api-access-ntlfk") pod "17b96904-2704-4fa1-b6a0-678ff1ce1899" (UID: "17b96904-2704-4fa1-b6a0-678ff1ce1899"). InnerVolumeSpecName "kube-api-access-ntlfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.204337 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17b96904-2704-4fa1-b6a0-678ff1ce1899" (UID: "17b96904-2704-4fa1-b6a0-678ff1ce1899"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.284297 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntlfk\" (UniqueName: \"kubernetes.io/projected/17b96904-2704-4fa1-b6a0-678ff1ce1899-kube-api-access-ntlfk\") on node \"crc\" DevicePath \"\"" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.284351 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.284364 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b96904-2704-4fa1-b6a0-678ff1ce1899-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.564411 4725 generic.go:334] "Generic (PLEG): container finished" podID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerID="ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1" exitCode=0 Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.564581 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f8dx" event={"ID":"17b96904-2704-4fa1-b6a0-678ff1ce1899","Type":"ContainerDied","Data":"ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1"} Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.564729 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4f8dx" event={"ID":"17b96904-2704-4fa1-b6a0-678ff1ce1899","Type":"ContainerDied","Data":"068d2ca2261a9cfd29c86945798b896011f78308782c4f05e1cee2b727c0cb92"} Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.564751 4725 scope.go:117] "RemoveContainer" containerID="ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 
06:55:21.564677 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4f8dx" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.582934 4725 scope.go:117] "RemoveContainer" containerID="b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.608171 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f8dx"] Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.613700 4725 scope.go:117] "RemoveContainer" containerID="7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.619328 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4f8dx"] Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.668652 4725 scope.go:117] "RemoveContainer" containerID="ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1" Feb 27 06:55:21 crc kubenswrapper[4725]: E0227 06:55:21.669673 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1\": container with ID starting with ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1 not found: ID does not exist" containerID="ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.669717 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1"} err="failed to get container status \"ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1\": rpc error: code = NotFound desc = could not find container \"ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1\": container with ID starting with 
ad5902c88dedf653fd4fc26239c4525670d0b808a2f180e0512f218b0e1d5aa1 not found: ID does not exist" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.669741 4725 scope.go:117] "RemoveContainer" containerID="b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044" Feb 27 06:55:21 crc kubenswrapper[4725]: E0227 06:55:21.670087 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044\": container with ID starting with b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044 not found: ID does not exist" containerID="b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.670110 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044"} err="failed to get container status \"b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044\": rpc error: code = NotFound desc = could not find container \"b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044\": container with ID starting with b8e78a2b4054eb38aba5f86de2cc17617b6164091b390a34a748318622f75044 not found: ID does not exist" Feb 27 06:55:21 crc kubenswrapper[4725]: I0227 06:55:21.670125 4725 scope.go:117] "RemoveContainer" containerID="7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b" Feb 27 06:55:21 crc kubenswrapper[4725]: E0227 06:55:21.670425 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b\": container with ID starting with 7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b not found: ID does not exist" containerID="7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b" Feb 27 06:55:21 crc 
kubenswrapper[4725]: I0227 06:55:21.670443 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b"} err="failed to get container status \"7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b\": rpc error: code = NotFound desc = could not find container \"7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b\": container with ID starting with 7827c1460cefd9712aa6240785a25178165f847552b835d13a6988ebd795885b not found: ID does not exist" Feb 27 06:55:22 crc kubenswrapper[4725]: I0227 06:55:22.265715 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" path="/var/lib/kubelet/pods/17b96904-2704-4fa1-b6a0-678ff1ce1899/volumes" Feb 27 06:55:32 crc kubenswrapper[4725]: I0227 06:55:32.555005 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:55:32 crc kubenswrapper[4725]: I0227 06:55:32.555544 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.143944 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536256-kxtdc"] Feb 27 06:56:00 crc kubenswrapper[4725]: E0227 06:56:00.145454 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="registry-server" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 
06:56:00.145476 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="registry-server" Feb 27 06:56:00 crc kubenswrapper[4725]: E0227 06:56:00.145497 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="extract-content" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.145506 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="extract-content" Feb 27 06:56:00 crc kubenswrapper[4725]: E0227 06:56:00.145519 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="extract-utilities" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.145528 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="extract-utilities" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.145790 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b96904-2704-4fa1-b6a0-678ff1ce1899" containerName="registry-server" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.146665 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.149259 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.150317 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.153438 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.160395 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536256-kxtdc"] Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.228046 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k8fh\" (UniqueName: \"kubernetes.io/projected/4784fd46-fab3-4706-b4e9-53818d8889e4-kube-api-access-8k8fh\") pod \"auto-csr-approver-29536256-kxtdc\" (UID: \"4784fd46-fab3-4706-b4e9-53818d8889e4\") " pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.330559 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k8fh\" (UniqueName: \"kubernetes.io/projected/4784fd46-fab3-4706-b4e9-53818d8889e4-kube-api-access-8k8fh\") pod \"auto-csr-approver-29536256-kxtdc\" (UID: \"4784fd46-fab3-4706-b4e9-53818d8889e4\") " pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.349703 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k8fh\" (UniqueName: \"kubernetes.io/projected/4784fd46-fab3-4706-b4e9-53818d8889e4-kube-api-access-8k8fh\") pod \"auto-csr-approver-29536256-kxtdc\" (UID: \"4784fd46-fab3-4706-b4e9-53818d8889e4\") " 
pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.467763 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:00 crc kubenswrapper[4725]: I0227 06:56:00.909433 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536256-kxtdc"] Feb 27 06:56:01 crc kubenswrapper[4725]: I0227 06:56:01.080068 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" event={"ID":"4784fd46-fab3-4706-b4e9-53818d8889e4","Type":"ContainerStarted","Data":"88f15b2462d5b0a09ddb5dc169586933ae62ba1252a7c442797af9b792850706"} Feb 27 06:56:02 crc kubenswrapper[4725]: I0227 06:56:02.555408 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:56:02 crc kubenswrapper[4725]: I0227 06:56:02.555808 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:56:03 crc kubenswrapper[4725]: I0227 06:56:03.103709 4725 generic.go:334] "Generic (PLEG): container finished" podID="4784fd46-fab3-4706-b4e9-53818d8889e4" containerID="185b997d2e59908445f218b179e89cc851877c18a9f3af276e1a8c5ffd92659b" exitCode=0 Feb 27 06:56:03 crc kubenswrapper[4725]: I0227 06:56:03.103777 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" 
event={"ID":"4784fd46-fab3-4706-b4e9-53818d8889e4","Type":"ContainerDied","Data":"185b997d2e59908445f218b179e89cc851877c18a9f3af276e1a8c5ffd92659b"} Feb 27 06:56:04 crc kubenswrapper[4725]: I0227 06:56:04.555534 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:04 crc kubenswrapper[4725]: I0227 06:56:04.725574 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k8fh\" (UniqueName: \"kubernetes.io/projected/4784fd46-fab3-4706-b4e9-53818d8889e4-kube-api-access-8k8fh\") pod \"4784fd46-fab3-4706-b4e9-53818d8889e4\" (UID: \"4784fd46-fab3-4706-b4e9-53818d8889e4\") " Feb 27 06:56:04 crc kubenswrapper[4725]: I0227 06:56:04.735552 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4784fd46-fab3-4706-b4e9-53818d8889e4-kube-api-access-8k8fh" (OuterVolumeSpecName: "kube-api-access-8k8fh") pod "4784fd46-fab3-4706-b4e9-53818d8889e4" (UID: "4784fd46-fab3-4706-b4e9-53818d8889e4"). InnerVolumeSpecName "kube-api-access-8k8fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:56:04 crc kubenswrapper[4725]: I0227 06:56:04.827500 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k8fh\" (UniqueName: \"kubernetes.io/projected/4784fd46-fab3-4706-b4e9-53818d8889e4-kube-api-access-8k8fh\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:05 crc kubenswrapper[4725]: I0227 06:56:05.123717 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" event={"ID":"4784fd46-fab3-4706-b4e9-53818d8889e4","Type":"ContainerDied","Data":"88f15b2462d5b0a09ddb5dc169586933ae62ba1252a7c442797af9b792850706"} Feb 27 06:56:05 crc kubenswrapper[4725]: I0227 06:56:05.123758 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536256-kxtdc" Feb 27 06:56:05 crc kubenswrapper[4725]: I0227 06:56:05.123771 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f15b2462d5b0a09ddb5dc169586933ae62ba1252a7c442797af9b792850706" Feb 27 06:56:05 crc kubenswrapper[4725]: I0227 06:56:05.636938 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536250-lfdqn"] Feb 27 06:56:05 crc kubenswrapper[4725]: I0227 06:56:05.645207 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536250-lfdqn"] Feb 27 06:56:06 crc kubenswrapper[4725]: I0227 06:56:06.264968 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf383bb-c952-4cbf-8f99-ffee8e8614f0" path="/var/lib/kubelet/pods/0bf383bb-c952-4cbf-8f99-ffee8e8614f0/volumes" Feb 27 06:56:32 crc kubenswrapper[4725]: I0227 06:56:32.554634 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:56:32 crc kubenswrapper[4725]: I0227 06:56:32.555273 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:56:32 crc kubenswrapper[4725]: I0227 06:56:32.555428 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:56:32 crc kubenswrapper[4725]: I0227 06:56:32.556530 4725 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc4f505db5c927b1115d9bb5cb4e6488533e4b86cbba3fd83d6f19a11a64f81b"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:56:32 crc kubenswrapper[4725]: I0227 06:56:32.556633 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://dc4f505db5c927b1115d9bb5cb4e6488533e4b86cbba3fd83d6f19a11a64f81b" gracePeriod=600 Feb 27 06:56:33 crc kubenswrapper[4725]: I0227 06:56:33.427560 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="dc4f505db5c927b1115d9bb5cb4e6488533e4b86cbba3fd83d6f19a11a64f81b" exitCode=0 Feb 27 06:56:33 crc kubenswrapper[4725]: I0227 06:56:33.427613 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"dc4f505db5c927b1115d9bb5cb4e6488533e4b86cbba3fd83d6f19a11a64f81b"} Feb 27 06:56:33 crc kubenswrapper[4725]: I0227 06:56:33.428085 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa"} Feb 27 06:56:33 crc kubenswrapper[4725]: I0227 06:56:33.428111 4725 scope.go:117] "RemoveContainer" containerID="fb84fde5f2fd214392b908a1c9f7166cd2980fab9f935b48cf7d5d14f61fcba5" Feb 27 06:56:45 crc kubenswrapper[4725]: I0227 06:56:45.549528 4725 generic.go:334] "Generic (PLEG): container finished" podID="9898977d-f2bf-4be4-9b90-82fbcc11ba8b" 
containerID="1d52a19bbba113ccc275c41b069328e44540e19ea97368bedb0a353471b9a5f8" exitCode=0 Feb 27 06:56:45 crc kubenswrapper[4725]: I0227 06:56:45.549584 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" event={"ID":"9898977d-f2bf-4be4-9b90-82fbcc11ba8b","Type":"ContainerDied","Data":"1d52a19bbba113ccc275c41b069328e44540e19ea97368bedb0a353471b9a5f8"} Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.054874 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228032 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-inventory\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228179 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-2\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228261 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g25wx\" (UniqueName: \"kubernetes.io/projected/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-kube-api-access-g25wx\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228389 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-0\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228413 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-telemetry-combined-ca-bundle\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228456 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-1\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.228485 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ssh-key-openstack-edpm-ipam\") pod \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\" (UID: \"9898977d-f2bf-4be4-9b90-82fbcc11ba8b\") " Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.234528 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.254843 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-kube-api-access-g25wx" (OuterVolumeSpecName: "kube-api-access-g25wx") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "kube-api-access-g25wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.260576 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.266737 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.267454 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.289781 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.294632 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-inventory" (OuterVolumeSpecName: "inventory") pod "9898977d-f2bf-4be4-9b90-82fbcc11ba8b" (UID: "9898977d-f2bf-4be4-9b90-82fbcc11ba8b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331219 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331264 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g25wx\" (UniqueName: \"kubernetes.io/projected/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-kube-api-access-g25wx\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331281 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331326 4725 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331340 4725 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331353 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.331365 4725 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9898977d-f2bf-4be4-9b90-82fbcc11ba8b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.573215 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" event={"ID":"9898977d-f2bf-4be4-9b90-82fbcc11ba8b","Type":"ContainerDied","Data":"349e978ae67377e05ff122aa5373650887a614be825f1d62046fb767964f928b"} Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.573266 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph" Feb 27 06:56:47 crc kubenswrapper[4725]: I0227 06:56:47.573273 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="349e978ae67377e05ff122aa5373650887a614be825f1d62046fb767964f928b" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.612884 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xp647"] Feb 27 06:56:56 crc kubenswrapper[4725]: E0227 06:56:56.613787 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4784fd46-fab3-4706-b4e9-53818d8889e4" containerName="oc" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.613799 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4784fd46-fab3-4706-b4e9-53818d8889e4" containerName="oc" Feb 27 06:56:56 crc kubenswrapper[4725]: E0227 06:56:56.613808 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9898977d-f2bf-4be4-9b90-82fbcc11ba8b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.613815 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9898977d-f2bf-4be4-9b90-82fbcc11ba8b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.613978 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9898977d-f2bf-4be4-9b90-82fbcc11ba8b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.613994 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4784fd46-fab3-4706-b4e9-53818d8889e4" containerName="oc" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.619261 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.663438 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xp647"] Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.747145 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-catalog-content\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.747432 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-utilities\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.747456 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgphp\" (UniqueName: \"kubernetes.io/projected/47ccf802-5d61-4bb1-a794-d9601e42689e-kube-api-access-xgphp\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.848852 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-catalog-content\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.848929 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-utilities\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.848963 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgphp\" (UniqueName: \"kubernetes.io/projected/47ccf802-5d61-4bb1-a794-d9601e42689e-kube-api-access-xgphp\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.849486 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-utilities\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.849816 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-catalog-content\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.869956 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgphp\" (UniqueName: \"kubernetes.io/projected/47ccf802-5d61-4bb1-a794-d9601e42689e-kube-api-access-xgphp\") pod \"redhat-operators-xp647\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:56 crc kubenswrapper[4725]: I0227 06:56:56.975239 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:56:57 crc kubenswrapper[4725]: I0227 06:56:57.439664 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xp647"] Feb 27 06:56:57 crc kubenswrapper[4725]: I0227 06:56:57.687684 4725 generic.go:334] "Generic (PLEG): container finished" podID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerID="fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d" exitCode=0 Feb 27 06:56:57 crc kubenswrapper[4725]: I0227 06:56:57.687787 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerDied","Data":"fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d"} Feb 27 06:56:57 crc kubenswrapper[4725]: I0227 06:56:57.687993 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerStarted","Data":"0052968cc9c0e7dd4d740b8587b00215d46c01e1801431d5047a87b172fddb4d"} Feb 27 06:56:58 crc kubenswrapper[4725]: I0227 06:56:58.699436 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerStarted","Data":"c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b"} Feb 27 06:57:01 crc kubenswrapper[4725]: I0227 06:57:01.728398 4725 generic.go:334] "Generic (PLEG): container finished" podID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerID="c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b" exitCode=0 Feb 27 06:57:01 crc kubenswrapper[4725]: I0227 06:57:01.728474 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" 
event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerDied","Data":"c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b"} Feb 27 06:57:02 crc kubenswrapper[4725]: I0227 06:57:02.740128 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerStarted","Data":"2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c"} Feb 27 06:57:02 crc kubenswrapper[4725]: I0227 06:57:02.772676 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xp647" podStartSLOduration=2.2711533250000002 podStartE2EDuration="6.772648135s" podCreationTimestamp="2026-02-27 06:56:56 +0000 UTC" firstStartedPulling="2026-02-27 06:56:57.689841032 +0000 UTC m=+2796.152461601" lastFinishedPulling="2026-02-27 06:57:02.191335802 +0000 UTC m=+2800.653956411" observedRunningTime="2026-02-27 06:57:02.765886924 +0000 UTC m=+2801.228507493" watchObservedRunningTime="2026-02-27 06:57:02.772648135 +0000 UTC m=+2801.235268784" Feb 27 06:57:04 crc kubenswrapper[4725]: I0227 06:57:04.277079 4725 scope.go:117] "RemoveContainer" containerID="fecf89a29d64d1c5443a1075c6ce7911dd793143dafe7b3d403df22531ed1955" Feb 27 06:57:06 crc kubenswrapper[4725]: I0227 06:57:06.976300 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:57:06 crc kubenswrapper[4725]: I0227 06:57:06.976912 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:57:08 crc kubenswrapper[4725]: I0227 06:57:08.027380 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xp647" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="registry-server" probeResult="failure" output=< Feb 27 06:57:08 crc kubenswrapper[4725]: timeout: failed to 
connect service ":50051" within 1s Feb 27 06:57:08 crc kubenswrapper[4725]: > Feb 27 06:57:17 crc kubenswrapper[4725]: I0227 06:57:17.031992 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:57:17 crc kubenswrapper[4725]: I0227 06:57:17.122498 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:57:17 crc kubenswrapper[4725]: I0227 06:57:17.278649 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xp647"] Feb 27 06:57:18 crc kubenswrapper[4725]: I0227 06:57:18.916395 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xp647" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="registry-server" containerID="cri-o://2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c" gracePeriod=2 Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.274983 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.276630 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.280512 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.336849 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436081 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436166 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-config-data\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436191 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436305 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-lib-modules\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436331 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9d6m\" (UniqueName: \"kubernetes.io/projected/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-kube-api-access-q9d6m\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436410 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436437 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436463 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-run\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436482 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436553 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436639 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.436673 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-dev\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.441969 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.444148 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.449001 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.462120 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-sys\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.462208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-scripts\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.462300 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.492461 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.559896 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.561694 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564545 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564582 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564601 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmxh\" (UniqueName: \"kubernetes.io/projected/ba6ad5a5-a980-46a3-8891-5448144c7885-kube-api-access-phmxh\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564619 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564642 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " 
pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564659 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564681 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-lib-modules\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564701 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9d6m\" (UniqueName: \"kubernetes.io/projected/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-kube-api-access-q9d6m\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564735 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564751 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564772 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564792 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564810 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-run\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564825 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564843 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-sys\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564865 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-nvme\") 
pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564882 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564926 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564948 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564967 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.564983 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-dev\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc 
kubenswrapper[4725]: I0227 06:57:19.565001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-run\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565028 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-sys\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565050 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-scripts\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565080 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565095 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-dev\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565113 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565133 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565160 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-config-data\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565175 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565304 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565596 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc 
kubenswrapper[4725]: I0227 06:57:19.565607 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565636 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-dev\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565657 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-lib-modules\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565659 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-sys\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.565784 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.567529 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.568854 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-run\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.584243 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-config-data\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.586836 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-scripts\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.591153 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.594571 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.594749 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.596299 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.603963 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.613140 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9d6m\" (UniqueName: \"kubernetes.io/projected/1c1a66bf-70db-4738-ae7d-4fd930ec4f4d-kube-api-access-q9d6m\") pod \"cinder-backup-0\" (UID: \"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d\") " pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.629006 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.666657 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.666737 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.666762 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667216 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667332 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667387 4725 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667418 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-sys\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667435 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667438 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667482 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-sys\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667651 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667774 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667828 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667906 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667966 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.667992 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-scripts\") pod 
\"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668043 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-run\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668087 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668103 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668173 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668212 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc 
kubenswrapper[4725]: I0227 06:57:19.668230 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z8kv\" (UniqueName: \"kubernetes.io/projected/be6347d4-c8d8-416d-9229-9671f6a027d4-kube-api-access-6z8kv\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668340 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668363 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-dev\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668462 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668535 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668551 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668580 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668605 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmxh\" (UniqueName: \"kubernetes.io/projected/ba6ad5a5-a980-46a3-8891-5448144c7885-kube-api-access-phmxh\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668629 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668679 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668702 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668757 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668904 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.668926 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-dev\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.670536 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-run\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.670928 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 
06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.670977 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.670990 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.671014 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.671058 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ba6ad5a5-a980-46a3-8891-5448144c7885-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.671918 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.671939 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.674116 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba6ad5a5-a980-46a3-8891-5448144c7885-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.688594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmxh\" (UniqueName: \"kubernetes.io/projected/ba6ad5a5-a980-46a3-8891-5448144c7885-kube-api-access-phmxh\") pod \"cinder-volume-nfs-0\" (UID: \"ba6ad5a5-a980-46a3-8891-5448144c7885\") " pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.757653 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.770704 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgphp\" (UniqueName: \"kubernetes.io/projected/47ccf802-5d61-4bb1-a794-d9601e42689e-kube-api-access-xgphp\") pod \"47ccf802-5d61-4bb1-a794-d9601e42689e\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.770923 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-catalog-content\") pod \"47ccf802-5d61-4bb1-a794-d9601e42689e\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.770983 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-utilities\") pod \"47ccf802-5d61-4bb1-a794-d9601e42689e\" (UID: \"47ccf802-5d61-4bb1-a794-d9601e42689e\") " Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771380 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771444 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771469 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771498 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771519 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z8kv\" (UniqueName: \"kubernetes.io/projected/be6347d4-c8d8-416d-9229-9671f6a027d4-kube-api-access-6z8kv\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771588 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.771640 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.775677 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-run\") pod \"cinder-volume-nfs-2-0\" 
(UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.776842 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-utilities" (OuterVolumeSpecName: "utilities") pod "47ccf802-5d61-4bb1-a794-d9601e42689e" (UID: "47ccf802-5d61-4bb1-a794-d9601e42689e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.777931 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.777971 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.777989 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778014 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: 
I0227 06:57:19.778043 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778141 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778216 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778363 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778406 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778434 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778474 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778495 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778580 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778612 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778713 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:57:19 crc kubenswrapper[4725]: 
I0227 06:57:19.778749 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778800 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.778929 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be6347d4-c8d8-416d-9229-9671f6a027d4-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.784417 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.784419 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.798023 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ccf802-5d61-4bb1-a794-d9601e42689e-kube-api-access-xgphp" 
(OuterVolumeSpecName: "kube-api-access-xgphp") pod "47ccf802-5d61-4bb1-a794-d9601e42689e" (UID: "47ccf802-5d61-4bb1-a794-d9601e42689e"). InnerVolumeSpecName "kube-api-access-xgphp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.799718 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.801598 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z8kv\" (UniqueName: \"kubernetes.io/projected/be6347d4-c8d8-416d-9229-9671f6a027d4-kube-api-access-6z8kv\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.807980 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be6347d4-c8d8-416d-9229-9671f6a027d4-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"be6347d4-c8d8-416d-9229-9671f6a027d4\") " pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.824138 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.880567 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgphp\" (UniqueName: \"kubernetes.io/projected/47ccf802-5d61-4bb1-a794-d9601e42689e-kube-api-access-xgphp\") on node \"crc\" DevicePath \"\"" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.930037 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47ccf802-5d61-4bb1-a794-d9601e42689e" (UID: "47ccf802-5d61-4bb1-a794-d9601e42689e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.965281 4725 generic.go:334] "Generic (PLEG): container finished" podID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerID="2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c" exitCode=0 Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.965372 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerDied","Data":"2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c"} Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.965427 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xp647" event={"ID":"47ccf802-5d61-4bb1-a794-d9601e42689e","Type":"ContainerDied","Data":"0052968cc9c0e7dd4d740b8587b00215d46c01e1801431d5047a87b172fddb4d"} Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.965455 4725 scope.go:117] "RemoveContainer" containerID="2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.965775 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xp647" Feb 27 06:57:19 crc kubenswrapper[4725]: I0227 06:57:19.982225 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ccf802-5d61-4bb1-a794-d9601e42689e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.021333 4725 scope.go:117] "RemoveContainer" containerID="c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.021651 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xp647"] Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.041883 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xp647"] Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.045439 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.247354 4725 scope.go:117] "RemoveContainer" containerID="fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.270654 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" path="/var/lib/kubelet/pods/47ccf802-5d61-4bb1-a794-d9601e42689e/volumes" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.278515 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.298239 4725 scope.go:117] "RemoveContainer" containerID="2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c" Feb 27 06:57:20 crc kubenswrapper[4725]: E0227 06:57:20.299041 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c\": container with ID starting with 2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c not found: ID does not exist" containerID="2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.299073 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c"} err="failed to get container status \"2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c\": rpc error: code = NotFound desc = could not find container \"2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c\": container with ID starting with 2a0fce7dacba8c67950d2268bd7ec94e19e6b86d061662c494b028805f12962c not found: ID does not exist" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.299110 4725 scope.go:117] "RemoveContainer" containerID="c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b" Feb 27 06:57:20 crc kubenswrapper[4725]: E0227 06:57:20.300204 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b\": container with ID starting with c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b not found: ID does not exist" containerID="c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.300228 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b"} err="failed to get container status \"c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b\": rpc error: code = NotFound desc = could not find container \"c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b\": container with ID 
starting with c4d6d09cb66ffd0308509f0a5ad56d2a7f05bdd453a65c35f9422a1c3065059b not found: ID does not exist" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.300249 4725 scope.go:117] "RemoveContainer" containerID="fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d" Feb 27 06:57:20 crc kubenswrapper[4725]: E0227 06:57:20.301068 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d\": container with ID starting with fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d not found: ID does not exist" containerID="fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.301091 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d"} err="failed to get container status \"fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d\": rpc error: code = NotFound desc = could not find container \"fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d\": container with ID starting with fd916fea4f74173e78fae0cf2606113601820018365ec3e3071188134c28763d not found: ID does not exist" Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.365819 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.714792 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.991119 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"ba6ad5a5-a980-46a3-8891-5448144c7885","Type":"ContainerStarted","Data":"a7e6c14b4491507ad2dcb9da03ba3a48353a83b96cf3bd182a4b88b77a5f4d78"} Feb 27 06:57:20 crc 
kubenswrapper[4725]: I0227 06:57:20.991607 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"ba6ad5a5-a980-46a3-8891-5448144c7885","Type":"ContainerStarted","Data":"7140fd954ac44347d8e91baba419279d72946b62cf961345a48d60252ee6904d"} Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.998349 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"be6347d4-c8d8-416d-9229-9671f6a027d4","Type":"ContainerStarted","Data":"d5e3e44c38dca623e79ce7d81b34ad464636ac260e5dca217dbcf745a8681791"} Feb 27 06:57:20 crc kubenswrapper[4725]: I0227 06:57:20.998390 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"be6347d4-c8d8-416d-9229-9671f6a027d4","Type":"ContainerStarted","Data":"e0062b6cedc3fdf78f8578ce788939c59922a19ee7b84799143a3f50a4412ceb"} Feb 27 06:57:21 crc kubenswrapper[4725]: I0227 06:57:21.003965 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d","Type":"ContainerStarted","Data":"1844180bb982022976da088eb39574d03bf8939bfcef6dc035c82ef92af58646"} Feb 27 06:57:21 crc kubenswrapper[4725]: I0227 06:57:21.004165 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d","Type":"ContainerStarted","Data":"9a8cb0724bb463e1f8eec1471ab5757e0fed6f4090aa6ab16b6506bd7e7ca63c"} Feb 27 06:57:22 crc kubenswrapper[4725]: I0227 06:57:22.013278 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"1c1a66bf-70db-4738-ae7d-4fd930ec4f4d","Type":"ContainerStarted","Data":"7cebc6cb521580eb8f3e4a61c012300a37f77f4e4349c6ed7230c7af42b8fbb4"} Feb 27 06:57:22 crc kubenswrapper[4725]: I0227 06:57:22.015367 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" 
event={"ID":"ba6ad5a5-a980-46a3-8891-5448144c7885","Type":"ContainerStarted","Data":"9a989371be25cdd8cf9768a7a2d7f5ff1cc731c6e2422cf8234d3d52722170f6"} Feb 27 06:57:22 crc kubenswrapper[4725]: I0227 06:57:22.018686 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"be6347d4-c8d8-416d-9229-9671f6a027d4","Type":"ContainerStarted","Data":"d1ae4b159938b8c3c4adefd5829dbb9d4a8d10c966f59f7795517a7345c28ab4"} Feb 27 06:57:22 crc kubenswrapper[4725]: I0227 06:57:22.043033 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.7497666450000002 podStartE2EDuration="3.043015739s" podCreationTimestamp="2026-02-27 06:57:19 +0000 UTC" firstStartedPulling="2026-02-27 06:57:20.368714876 +0000 UTC m=+2818.831335445" lastFinishedPulling="2026-02-27 06:57:20.66196397 +0000 UTC m=+2819.124584539" observedRunningTime="2026-02-27 06:57:22.039080938 +0000 UTC m=+2820.501701507" watchObservedRunningTime="2026-02-27 06:57:22.043015739 +0000 UTC m=+2820.505636308" Feb 27 06:57:22 crc kubenswrapper[4725]: I0227 06:57:22.063454 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.742301384 podStartE2EDuration="3.063432776s" podCreationTimestamp="2026-02-27 06:57:19 +0000 UTC" firstStartedPulling="2026-02-27 06:57:20.299602193 +0000 UTC m=+2818.762222762" lastFinishedPulling="2026-02-27 06:57:20.620733585 +0000 UTC m=+2819.083354154" observedRunningTime="2026-02-27 06:57:22.061653626 +0000 UTC m=+2820.524274205" watchObservedRunningTime="2026-02-27 06:57:22.063432776 +0000 UTC m=+2820.526053355" Feb 27 06:57:22 crc kubenswrapper[4725]: I0227 06:57:22.084530 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=3.084510162 podStartE2EDuration="3.084510162s" podCreationTimestamp="2026-02-27 06:57:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:57:22.079923972 +0000 UTC m=+2820.542544561" watchObservedRunningTime="2026-02-27 06:57:22.084510162 +0000 UTC m=+2820.547130741" Feb 27 06:57:24 crc kubenswrapper[4725]: I0227 06:57:24.629840 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 27 06:57:24 crc kubenswrapper[4725]: I0227 06:57:24.825735 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Feb 27 06:57:25 crc kubenswrapper[4725]: I0227 06:57:25.046457 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:25 crc kubenswrapper[4725]: I0227 06:57:25.190093 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-nfs-2-0" podUID="be6347d4-c8d8-416d-9229-9671f6a027d4" containerName="cinder-volume" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 06:57:29 crc kubenswrapper[4725]: I0227 06:57:29.877616 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 27 06:57:30 crc kubenswrapper[4725]: I0227 06:57:30.078514 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Feb 27 06:57:30 crc kubenswrapper[4725]: I0227 06:57:30.109240 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.171073 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536258-hddlw"] Feb 27 06:58:00 crc kubenswrapper[4725]: E0227 06:58:00.172808 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="registry-server" Feb 27 
06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.172844 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="registry-server" Feb 27 06:58:00 crc kubenswrapper[4725]: E0227 06:58:00.172903 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="extract-content" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.172922 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="extract-content" Feb 27 06:58:00 crc kubenswrapper[4725]: E0227 06:58:00.172979 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="extract-utilities" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.172999 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="extract-utilities" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.173649 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ccf802-5d61-4bb1-a794-d9601e42689e" containerName="registry-server" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.175282 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.178555 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.178769 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.179321 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.186582 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536258-hddlw"] Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.283267 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7skn\" (UniqueName: \"kubernetes.io/projected/d18a4b60-644f-4d9b-8679-c1034e916b8e-kube-api-access-h7skn\") pod \"auto-csr-approver-29536258-hddlw\" (UID: \"d18a4b60-644f-4d9b-8679-c1034e916b8e\") " pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.386785 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7skn\" (UniqueName: \"kubernetes.io/projected/d18a4b60-644f-4d9b-8679-c1034e916b8e-kube-api-access-h7skn\") pod \"auto-csr-approver-29536258-hddlw\" (UID: \"d18a4b60-644f-4d9b-8679-c1034e916b8e\") " pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.411359 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7skn\" (UniqueName: \"kubernetes.io/projected/d18a4b60-644f-4d9b-8679-c1034e916b8e-kube-api-access-h7skn\") pod \"auto-csr-approver-29536258-hddlw\" (UID: \"d18a4b60-644f-4d9b-8679-c1034e916b8e\") " 
pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.504679 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:00 crc kubenswrapper[4725]: I0227 06:58:00.986259 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536258-hddlw"] Feb 27 06:58:01 crc kubenswrapper[4725]: W0227 06:58:01.005779 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd18a4b60_644f_4d9b_8679_c1034e916b8e.slice/crio-05d98de08ea9b52e99b03ac719c086a43796fa8f4861f12e027cdaf86962e3d0 WatchSource:0}: Error finding container 05d98de08ea9b52e99b03ac719c086a43796fa8f4861f12e027cdaf86962e3d0: Status 404 returned error can't find the container with id 05d98de08ea9b52e99b03ac719c086a43796fa8f4861f12e027cdaf86962e3d0 Feb 27 06:58:01 crc kubenswrapper[4725]: I0227 06:58:01.886673 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536258-hddlw" event={"ID":"d18a4b60-644f-4d9b-8679-c1034e916b8e","Type":"ContainerStarted","Data":"05d98de08ea9b52e99b03ac719c086a43796fa8f4861f12e027cdaf86962e3d0"} Feb 27 06:58:02 crc kubenswrapper[4725]: I0227 06:58:02.903057 4725 generic.go:334] "Generic (PLEG): container finished" podID="d18a4b60-644f-4d9b-8679-c1034e916b8e" containerID="2c22253d1bec1037ab6f4cbdc205aea240c1625bf9b10104212cb4a7432cba7f" exitCode=0 Feb 27 06:58:02 crc kubenswrapper[4725]: I0227 06:58:02.903179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536258-hddlw" event={"ID":"d18a4b60-644f-4d9b-8679-c1034e916b8e","Type":"ContainerDied","Data":"2c22253d1bec1037ab6f4cbdc205aea240c1625bf9b10104212cb4a7432cba7f"} Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.290956 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.387254 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7skn\" (UniqueName: \"kubernetes.io/projected/d18a4b60-644f-4d9b-8679-c1034e916b8e-kube-api-access-h7skn\") pod \"d18a4b60-644f-4d9b-8679-c1034e916b8e\" (UID: \"d18a4b60-644f-4d9b-8679-c1034e916b8e\") " Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.396962 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18a4b60-644f-4d9b-8679-c1034e916b8e-kube-api-access-h7skn" (OuterVolumeSpecName: "kube-api-access-h7skn") pod "d18a4b60-644f-4d9b-8679-c1034e916b8e" (UID: "d18a4b60-644f-4d9b-8679-c1034e916b8e"). InnerVolumeSpecName "kube-api-access-h7skn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.490328 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7skn\" (UniqueName: \"kubernetes.io/projected/d18a4b60-644f-4d9b-8679-c1034e916b8e-kube-api-access-h7skn\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.929227 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536258-hddlw" event={"ID":"d18a4b60-644f-4d9b-8679-c1034e916b8e","Type":"ContainerDied","Data":"05d98de08ea9b52e99b03ac719c086a43796fa8f4861f12e027cdaf86962e3d0"} Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.929640 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05d98de08ea9b52e99b03ac719c086a43796fa8f4861f12e027cdaf86962e3d0" Feb 27 06:58:04 crc kubenswrapper[4725]: I0227 06:58:04.929347 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536258-hddlw" Feb 27 06:58:05 crc kubenswrapper[4725]: I0227 06:58:05.388917 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536252-wkxbg"] Feb 27 06:58:05 crc kubenswrapper[4725]: I0227 06:58:05.400886 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536252-wkxbg"] Feb 27 06:58:06 crc kubenswrapper[4725]: I0227 06:58:06.273000 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c30d4fe-9081-49a6-a7cf-368637b3fa0c" path="/var/lib/kubelet/pods/8c30d4fe-9081-49a6-a7cf-368637b3fa0c/volumes" Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.038049 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.039103 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="prometheus" containerID="cri-o://8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421" gracePeriod=600 Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.040100 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="thanos-sidecar" containerID="cri-o://44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98" gracePeriod=600 Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.040179 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="config-reloader" containerID="cri-o://cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac" gracePeriod=600 Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.201926 4725 
generic.go:334] "Generic (PLEG): container finished" podID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerID="44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98" exitCode=0 Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.201956 4725 generic.go:334] "Generic (PLEG): container finished" podID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerID="8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421" exitCode=0 Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.201978 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerDied","Data":"44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98"} Feb 27 06:58:26 crc kubenswrapper[4725]: I0227 06:58:26.202004 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerDied","Data":"8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421"} Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.086532 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.183305 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-2\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.183494 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1729f58a-98a0-4128-8644-c1a7643f09c8-config-out\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184426 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkhdg\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-kube-api-access-zkhdg\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184508 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184522 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: 
"1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184554 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184580 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-secret-combined-ca-bundle\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184649 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-thanos-prometheus-http-client-file\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184685 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-0\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184757 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-1\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184800 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-config\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184820 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.184940 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-tls-assets\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.185710 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"1729f58a-98a0-4128-8644-c1a7643f09c8\" (UID: \"1729f58a-98a0-4128-8644-c1a7643f09c8\") " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.186012 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: 
"1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.186797 4725 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.186822 4725 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.187139 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.191633 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-kube-api-access-zkhdg" (OuterVolumeSpecName: "kube-api-access-zkhdg") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "kube-api-access-zkhdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.191685 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.192313 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.192714 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-config" (OuterVolumeSpecName: "config") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.192775 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1729f58a-98a0-4128-8644-c1a7643f09c8-config-out" (OuterVolumeSpecName: "config-out") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.193523 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.199961 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.200753 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.224311 4725 generic.go:334] "Generic (PLEG): container finished" podID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerID="cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac" exitCode=0 Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.224369 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerDied","Data":"cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac"} Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.224406 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1729f58a-98a0-4128-8644-c1a7643f09c8","Type":"ContainerDied","Data":"904f0e685b29974d82c20e11a2e80f4578184d3ffab44721b02b0f72630f2367"} Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.225176 4725 scope.go:117] "RemoveContainer" containerID="44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.225521 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.226681 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.291261 4725 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293296 4725 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1729f58a-98a0-4128-8644-c1a7643f09c8-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293389 4725 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293453 4725 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293520 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") on node \"crc\" " Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293580 4725 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1729f58a-98a0-4128-8644-c1a7643f09c8-config-out\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293655 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkhdg\" (UniqueName: 
\"kubernetes.io/projected/1729f58a-98a0-4128-8644-c1a7643f09c8-kube-api-access-zkhdg\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.293709 4725 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.294116 4725 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.294188 4725 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.319989 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config" (OuterVolumeSpecName: "web-config") pod "1729f58a-98a0-4128-8644-c1a7643f09c8" (UID: "1729f58a-98a0-4128-8644-c1a7643f09c8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.338613 4725 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.338802 4725 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd") on node "crc" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.401851 4725 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1729f58a-98a0-4128-8644-c1a7643f09c8-web-config\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.402182 4725 reconciler_common.go:293] "Volume detached for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") on node \"crc\" DevicePath \"\"" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.415192 4725 scope.go:117] "RemoveContainer" containerID="cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.452493 4725 scope.go:117] "RemoveContainer" containerID="8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.477385 4725 scope.go:117] "RemoveContainer" containerID="970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.508678 4725 scope.go:117] "RemoveContainer" containerID="44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.509106 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98\": container with ID starting with 44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98 not found: ID does not exist" 
containerID="44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.509145 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98"} err="failed to get container status \"44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98\": rpc error: code = NotFound desc = could not find container \"44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98\": container with ID starting with 44e19c2922434c17b1998301a64e8aec70fb126f3719e9e10e66323692547e98 not found: ID does not exist" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.509171 4725 scope.go:117] "RemoveContainer" containerID="cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.509510 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac\": container with ID starting with cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac not found: ID does not exist" containerID="cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.509542 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac"} err="failed to get container status \"cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac\": rpc error: code = NotFound desc = could not find container \"cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac\": container with ID starting with cf4ecc6c39d9c761e7b806177dac268d33209f1720af0fc59f79ad9061a77cac not found: ID does not exist" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.509562 4725 scope.go:117] 
"RemoveContainer" containerID="8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.509924 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421\": container with ID starting with 8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421 not found: ID does not exist" containerID="8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.509959 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421"} err="failed to get container status \"8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421\": rpc error: code = NotFound desc = could not find container \"8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421\": container with ID starting with 8c101a6c6b028a528a730b6c3a028fa1a76bde9334b629a449e596d512569421 not found: ID does not exist" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.509977 4725 scope.go:117] "RemoveContainer" containerID="970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.512540 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c\": container with ID starting with 970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c not found: ID does not exist" containerID="970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.512598 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c"} err="failed to get container status \"970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c\": rpc error: code = NotFound desc = could not find container \"970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c\": container with ID starting with 970c2985c32b336c4bfe030cc080e7c3b3f734851abb34c953070eae1dda504c not found: ID does not exist" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.572349 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.602260 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.615516 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.619894 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18a4b60-644f-4d9b-8679-c1034e916b8e" containerName="oc" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.619935 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18a4b60-644f-4d9b-8679-c1034e916b8e" containerName="oc" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.619960 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="thanos-sidecar" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.619969 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="thanos-sidecar" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.619986 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="prometheus" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.619994 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="prometheus" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.620012 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="config-reloader" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.620021 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="config-reloader" Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.620040 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="init-config-reloader" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.620048 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="init-config-reloader" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.620312 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="config-reloader" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.620332 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="prometheus" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.620350 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" containerName="thanos-sidecar" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.620365 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18a4b60-644f-4d9b-8679-c1034e916b8e" containerName="oc" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.622766 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.633555 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.633683 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.633800 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.633881 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.633932 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.634214 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.634915 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l87pd" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.637908 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.642232 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:58:27 crc kubenswrapper[4725]: E0227 06:58:27.754073 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1729f58a_98a0_4128_8644_c1a7643f09c8.slice/crio-904f0e685b29974d82c20e11a2e80f4578184d3ffab44721b02b0f72630f2367\": RecentStats: unable to find data in memory cache]" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.811540 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-config\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.811860 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812011 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812196 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5t67\" (UniqueName: \"kubernetes.io/projected/06478af8-d30f-4c96-9dbc-360abe61500b-kube-api-access-g5t67\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812358 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812566 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812790 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812881 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.812997 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.813115 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.813229 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.813340 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06478af8-d30f-4c96-9dbc-360abe61500b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.813438 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06478af8-d30f-4c96-9dbc-360abe61500b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915540 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915610 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06478af8-d30f-4c96-9dbc-360abe61500b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915637 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06478af8-d30f-4c96-9dbc-360abe61500b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915660 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-config\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915695 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915755 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915808 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5t67\" (UniqueName: \"kubernetes.io/projected/06478af8-d30f-4c96-9dbc-360abe61500b-kube-api-access-g5t67\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.915922 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916103 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916131 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " 
pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916183 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916215 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916758 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.916954 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.917434 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06478af8-d30f-4c96-9dbc-360abe61500b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.920884 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.921366 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.921746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06478af8-d30f-4c96-9dbc-360abe61500b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.921804 4725 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.922861 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.922898 4725 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.922927 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e30db3ebf2fd7ac6c73c4f03a68dbdb833990d29c0091afb2dd5fd8d7a51236/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.923377 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc 
kubenswrapper[4725]: I0227 06:58:27.923405 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/06478af8-d30f-4c96-9dbc-360abe61500b-config\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.925670 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06478af8-d30f-4c96-9dbc-360abe61500b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.935171 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5t67\" (UniqueName: \"kubernetes.io/projected/06478af8-d30f-4c96-9dbc-360abe61500b-kube-api-access-g5t67\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:27 crc kubenswrapper[4725]: I0227 06:58:27.966401 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd981ab8-e2fc-4994-a89f-14c6ce47c7cd\") pod \"prometheus-metric-storage-0\" (UID: \"06478af8-d30f-4c96-9dbc-360abe61500b\") " pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:28 crc kubenswrapper[4725]: I0227 06:58:28.246207 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:28 crc kubenswrapper[4725]: I0227 06:58:28.266218 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1729f58a-98a0-4128-8644-c1a7643f09c8" path="/var/lib/kubelet/pods/1729f58a-98a0-4128-8644-c1a7643f09c8/volumes" Feb 27 06:58:28 crc kubenswrapper[4725]: I0227 06:58:28.757063 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 06:58:29 crc kubenswrapper[4725]: I0227 06:58:29.252640 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"06478af8-d30f-4c96-9dbc-360abe61500b","Type":"ContainerStarted","Data":"48ccc23ddea19260d19b02be97ce8a26998c72b13852b8e427f9bc0c1ebfa14c"} Feb 27 06:58:32 crc kubenswrapper[4725]: I0227 06:58:32.292769 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"06478af8-d30f-4c96-9dbc-360abe61500b","Type":"ContainerStarted","Data":"6048ff6fbbb53dfbd48a19091c49d877183b0ac00a989db6b87473de95515431"} Feb 27 06:58:32 crc kubenswrapper[4725]: I0227 06:58:32.554718 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:58:32 crc kubenswrapper[4725]: I0227 06:58:32.555241 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:58:40 crc kubenswrapper[4725]: I0227 06:58:40.382689 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="06478af8-d30f-4c96-9dbc-360abe61500b" containerID="6048ff6fbbb53dfbd48a19091c49d877183b0ac00a989db6b87473de95515431" exitCode=0 Feb 27 06:58:40 crc kubenswrapper[4725]: I0227 06:58:40.382826 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"06478af8-d30f-4c96-9dbc-360abe61500b","Type":"ContainerDied","Data":"6048ff6fbbb53dfbd48a19091c49d877183b0ac00a989db6b87473de95515431"} Feb 27 06:58:41 crc kubenswrapper[4725]: I0227 06:58:41.399365 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"06478af8-d30f-4c96-9dbc-360abe61500b","Type":"ContainerStarted","Data":"507d1444208ffae4dbc7769f443852cb0ca1d77d0439a2db66f65a650196a26a"} Feb 27 06:58:44 crc kubenswrapper[4725]: I0227 06:58:44.434812 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"06478af8-d30f-4c96-9dbc-360abe61500b","Type":"ContainerStarted","Data":"a3cd0aa7d066e224568cd4c76768a7e7bc900b98a993379d3ab2da4ad0167646"} Feb 27 06:58:45 crc kubenswrapper[4725]: I0227 06:58:45.453208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"06478af8-d30f-4c96-9dbc-360abe61500b","Type":"ContainerStarted","Data":"3c1375a71577f1fe410880731309f83424415bc081bb208914ea280932f327a0"} Feb 27 06:58:45 crc kubenswrapper[4725]: I0227 06:58:45.544170 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.544145349 podStartE2EDuration="18.544145349s" podCreationTimestamp="2026-02-27 06:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 06:58:45.536628357 +0000 UTC m=+2903.999248936" watchObservedRunningTime="2026-02-27 06:58:45.544145349 +0000 UTC m=+2904.006765918" Feb 27 06:58:48 crc kubenswrapper[4725]: 
I0227 06:58:48.246554 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.297712 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8nl9m"] Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.300938 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.315745 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nl9m"] Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.316495 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg42z\" (UniqueName: \"kubernetes.io/projected/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-kube-api-access-pg42z\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.316578 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-utilities\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.316650 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-catalog-content\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.419161 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-catalog-content\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.419348 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg42z\" (UniqueName: \"kubernetes.io/projected/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-kube-api-access-pg42z\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.419414 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-utilities\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.419666 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-catalog-content\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.419913 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-utilities\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.459721 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pg42z\" (UniqueName: \"kubernetes.io/projected/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-kube-api-access-pg42z\") pod \"community-operators-8nl9m\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.467895 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72d9t"] Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.470104 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.495476 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72d9t"] Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.521645 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-utilities\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.521722 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-catalog-content\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.521758 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrl8k\" (UniqueName: \"kubernetes.io/projected/7e9a4083-c9d6-4059-85d5-dd5fef760048-kube-api-access-xrl8k\") pod \"certified-operators-72d9t\" (UID: 
\"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.624849 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.625403 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-utilities\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.625483 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-catalog-content\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.625528 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrl8k\" (UniqueName: \"kubernetes.io/projected/7e9a4083-c9d6-4059-85d5-dd5fef760048-kube-api-access-xrl8k\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.626600 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-utilities\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.627255 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-catalog-content\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.649084 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrl8k\" (UniqueName: \"kubernetes.io/projected/7e9a4083-c9d6-4059-85d5-dd5fef760048-kube-api-access-xrl8k\") pod \"certified-operators-72d9t\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:57 crc kubenswrapper[4725]: I0227 06:58:57.815855 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.210012 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nl9m"] Feb 27 06:58:58 crc kubenswrapper[4725]: W0227 06:58:58.225643 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-90128566eec262770317237568671899f8fca33c26014043e36ed7477f9cd1a6 WatchSource:0}: Error finding container 90128566eec262770317237568671899f8fca33c26014043e36ed7477f9cd1a6: Status 404 returned error can't find the container with id 90128566eec262770317237568671899f8fca33c26014043e36ed7477f9cd1a6 Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.248002 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.273032 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.430718 4725 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72d9t"] Feb 27 06:58:58 crc kubenswrapper[4725]: E0227 06:58:58.607802 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-0d7ef55a496e1475c14594177f285b6f753aef6046fe27e738dd29adc76d793a.scope\": RecentStats: unable to find data in memory cache]" Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.613153 4725 generic.go:334] "Generic (PLEG): container finished" podID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerID="0d7ef55a496e1475c14594177f285b6f753aef6046fe27e738dd29adc76d793a" exitCode=0 Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.613214 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerDied","Data":"0d7ef55a496e1475c14594177f285b6f753aef6046fe27e738dd29adc76d793a"} Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.613418 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerStarted","Data":"90128566eec262770317237568671899f8fca33c26014043e36ed7477f9cd1a6"} Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.616783 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerStarted","Data":"16ed9a49d9e62039e36ad6f1de454b0a6c47da7e440caaad17196d6b1437487b"} Feb 27 06:58:58 crc kubenswrapper[4725]: I0227 06:58:58.630309 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 27 06:58:59 crc kubenswrapper[4725]: I0227 06:58:59.630437 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerStarted","Data":"7f70efae7f120c37b51032654a72bcba42ac69833a082020e6735663b594690d"} Feb 27 06:58:59 crc kubenswrapper[4725]: I0227 06:58:59.632334 4725 generic.go:334] "Generic (PLEG): container finished" podID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerID="3f16505213edd59ee2b2e24b9a6431fe48d58d62ced23b40824d6ef57380c629" exitCode=0 Feb 27 06:58:59 crc kubenswrapper[4725]: I0227 06:58:59.632767 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerDied","Data":"3f16505213edd59ee2b2e24b9a6431fe48d58d62ced23b40824d6ef57380c629"} Feb 27 06:59:00 crc kubenswrapper[4725]: I0227 06:59:00.650924 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerStarted","Data":"45b69d93649c63761f6e61746b63ebe9a185df9999da17b48bd5b133fa7ffda5"} Feb 27 06:59:01 crc kubenswrapper[4725]: I0227 06:59:01.652068 4725 generic.go:334] "Generic (PLEG): container finished" podID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerID="7f70efae7f120c37b51032654a72bcba42ac69833a082020e6735663b594690d" exitCode=0 Feb 27 06:59:01 crc kubenswrapper[4725]: I0227 06:59:01.653028 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerDied","Data":"7f70efae7f120c37b51032654a72bcba42ac69833a082020e6735663b594690d"} Feb 27 06:59:01 crc kubenswrapper[4725]: I0227 06:59:01.655776 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 06:59:02 crc kubenswrapper[4725]: I0227 06:59:02.554270 4725 patch_prober.go:28] interesting 
pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:59:02 crc kubenswrapper[4725]: I0227 06:59:02.554623 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:59:02 crc kubenswrapper[4725]: I0227 06:59:02.675002 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerStarted","Data":"546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4"} Feb 27 06:59:02 crc kubenswrapper[4725]: I0227 06:59:02.707110 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8nl9m" podStartSLOduration=2.224082895 podStartE2EDuration="5.70709168s" podCreationTimestamp="2026-02-27 06:58:57 +0000 UTC" firstStartedPulling="2026-02-27 06:58:58.615653931 +0000 UTC m=+2917.078274500" lastFinishedPulling="2026-02-27 06:59:02.098662716 +0000 UTC m=+2920.561283285" observedRunningTime="2026-02-27 06:59:02.69789331 +0000 UTC m=+2921.160513899" watchObservedRunningTime="2026-02-27 06:59:02.70709168 +0000 UTC m=+2921.169712249" Feb 27 06:59:03 crc kubenswrapper[4725]: I0227 06:59:03.686079 4725 generic.go:334] "Generic (PLEG): container finished" podID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerID="45b69d93649c63761f6e61746b63ebe9a185df9999da17b48bd5b133fa7ffda5" exitCode=0 Feb 27 06:59:03 crc kubenswrapper[4725]: I0227 06:59:03.686464 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerDied","Data":"45b69d93649c63761f6e61746b63ebe9a185df9999da17b48bd5b133fa7ffda5"} Feb 27 06:59:04 crc kubenswrapper[4725]: I0227 06:59:04.421179 4725 scope.go:117] "RemoveContainer" containerID="288837e3e4e3f76ded5e21704f5af7621f95172ee4b87dd6a4f989cf1f63b58f" Feb 27 06:59:04 crc kubenswrapper[4725]: I0227 06:59:04.700332 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerStarted","Data":"13365fc3187adef3fb3f93785242333a32116046fb72d10cc11b57831df308f0"} Feb 27 06:59:04 crc kubenswrapper[4725]: I0227 06:59:04.724001 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72d9t" podStartSLOduration=3.288226772 podStartE2EDuration="7.72397843s" podCreationTimestamp="2026-02-27 06:58:57 +0000 UTC" firstStartedPulling="2026-02-27 06:58:59.634975323 +0000 UTC m=+2918.097595902" lastFinishedPulling="2026-02-27 06:59:04.070726981 +0000 UTC m=+2922.533347560" observedRunningTime="2026-02-27 06:59:04.716761156 +0000 UTC m=+2923.179381725" watchObservedRunningTime="2026-02-27 06:59:04.72397843 +0000 UTC m=+2923.186598999" Feb 27 06:59:07 crc kubenswrapper[4725]: I0227 06:59:07.625777 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:59:07 crc kubenswrapper[4725]: I0227 06:59:07.626116 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:59:07 crc kubenswrapper[4725]: I0227 06:59:07.816127 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:59:07 crc kubenswrapper[4725]: I0227 06:59:07.816204 4725 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:59:07 crc kubenswrapper[4725]: I0227 06:59:07.872651 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:59:08 crc kubenswrapper[4725]: I0227 06:59:08.673805 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8nl9m" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="registry-server" probeResult="failure" output=< Feb 27 06:59:08 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 06:59:08 crc kubenswrapper[4725]: > Feb 27 06:59:17 crc kubenswrapper[4725]: I0227 06:59:17.699827 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:59:17 crc kubenswrapper[4725]: I0227 06:59:17.747002 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:59:17 crc kubenswrapper[4725]: I0227 06:59:17.889413 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.239626 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nl9m"] Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.240422 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8nl9m" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="registry-server" containerID="cri-o://546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4" gracePeriod=2 Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.369629 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 06:59:22 crc 
kubenswrapper[4725]: I0227 06:59:22.371296 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.373489 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p8rhj" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.374811 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.375037 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.375622 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.397823 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.542549 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.542628 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.542774 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.542908 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.543129 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6gh7\" (UniqueName: \"kubernetes.io/projected/a4ededce-4af9-418c-af09-c79e79cb044f-kube-api-access-h6gh7\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.543186 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.543280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.543379 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.543458 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-config-data\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.643720 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72d9t"] Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.645451 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72d9t" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="registry-server" containerID="cri-o://13365fc3187adef3fb3f93785242333a32116046fb72d10cc11b57831df308f0" gracePeriod=2 Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650341 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650434 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650510 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650576 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650691 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6gh7\" (UniqueName: \"kubernetes.io/projected/a4ededce-4af9-418c-af09-c79e79cb044f-kube-api-access-h6gh7\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650743 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650908 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.650995 4725 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.652453 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.652792 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-config-data\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.654037 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.654522 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.655045 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.656931 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-config-data\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.689688 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.692675 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.693561 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.702469 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6gh7\" (UniqueName: \"kubernetes.io/projected/a4ededce-4af9-418c-af09-c79e79cb044f-kube-api-access-h6gh7\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " 
pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.732462 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.744180 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.889177 4725 generic.go:334] "Generic (PLEG): container finished" podID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerID="546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4" exitCode=0 Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.889208 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerDied","Data":"546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4"} Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.889827 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nl9m" event={"ID":"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde","Type":"ContainerDied","Data":"90128566eec262770317237568671899f8fca33c26014043e36ed7477f9cd1a6"} Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.889851 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90128566eec262770317237568671899f8fca33c26014043e36ed7477f9cd1a6" Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.923745 4725 generic.go:334] "Generic (PLEG): container finished" podID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerID="13365fc3187adef3fb3f93785242333a32116046fb72d10cc11b57831df308f0" exitCode=0 Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.923790 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerDied","Data":"13365fc3187adef3fb3f93785242333a32116046fb72d10cc11b57831df308f0"} Feb 27 06:59:22 crc kubenswrapper[4725]: I0227 06:59:22.960904 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.081094 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-utilities\") pod \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.081355 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-catalog-content\") pod \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.081665 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg42z\" (UniqueName: \"kubernetes.io/projected/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-kube-api-access-pg42z\") pod \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\" (UID: \"1d633e54-e2ff-4c65-9cb6-f3ba5279fbde\") " Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.083263 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-utilities" (OuterVolumeSpecName: "utilities") pod "1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" (UID: "1d633e54-e2ff-4c65-9cb6-f3ba5279fbde"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.092174 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-kube-api-access-pg42z" (OuterVolumeSpecName: "kube-api-access-pg42z") pod "1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" (UID: "1d633e54-e2ff-4c65-9cb6-f3ba5279fbde"). InnerVolumeSpecName "kube-api-access-pg42z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.170903 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" (UID: "1d633e54-e2ff-4c65-9cb6-f3ba5279fbde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.183511 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.183536 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg42z\" (UniqueName: \"kubernetes.io/projected/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-kube-api-access-pg42z\") on node \"crc\" DevicePath \"\"" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.183547 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.187319 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.372607 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 06:59:23 crc kubenswrapper[4725]: W0227 06:59:23.380101 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ededce_4af9_418c_af09_c79e79cb044f.slice/crio-3438e28c42b333e3d9f54615c9e4c5ec3f1e1b10f7e5b0d54f340d9b2e7f8476 WatchSource:0}: Error finding container 3438e28c42b333e3d9f54615c9e4c5ec3f1e1b10f7e5b0d54f340d9b2e7f8476: Status 404 returned error can't find the container with id 3438e28c42b333e3d9f54615c9e4c5ec3f1e1b10f7e5b0d54f340d9b2e7f8476 Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.386940 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-utilities\") pod \"7e9a4083-c9d6-4059-85d5-dd5fef760048\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.387093 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-catalog-content\") pod \"7e9a4083-c9d6-4059-85d5-dd5fef760048\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.387317 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrl8k\" (UniqueName: \"kubernetes.io/projected/7e9a4083-c9d6-4059-85d5-dd5fef760048-kube-api-access-xrl8k\") pod \"7e9a4083-c9d6-4059-85d5-dd5fef760048\" (UID: \"7e9a4083-c9d6-4059-85d5-dd5fef760048\") " Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.388009 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-utilities" (OuterVolumeSpecName: "utilities") pod "7e9a4083-c9d6-4059-85d5-dd5fef760048" (UID: "7e9a4083-c9d6-4059-85d5-dd5fef760048"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.395721 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9a4083-c9d6-4059-85d5-dd5fef760048-kube-api-access-xrl8k" (OuterVolumeSpecName: "kube-api-access-xrl8k") pod "7e9a4083-c9d6-4059-85d5-dd5fef760048" (UID: "7e9a4083-c9d6-4059-85d5-dd5fef760048"). InnerVolumeSpecName "kube-api-access-xrl8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.463230 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e9a4083-c9d6-4059-85d5-dd5fef760048" (UID: "7e9a4083-c9d6-4059-85d5-dd5fef760048"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.490750 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.490778 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e9a4083-c9d6-4059-85d5-dd5fef760048-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.490788 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrl8k\" (UniqueName: \"kubernetes.io/projected/7e9a4083-c9d6-4059-85d5-dd5fef760048-kube-api-access-xrl8k\") on node \"crc\" DevicePath \"\"" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.934209 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4ededce-4af9-418c-af09-c79e79cb044f","Type":"ContainerStarted","Data":"3438e28c42b333e3d9f54615c9e4c5ec3f1e1b10f7e5b0d54f340d9b2e7f8476"} Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.936416 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nl9m" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.936482 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72d9t" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.936523 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72d9t" event={"ID":"7e9a4083-c9d6-4059-85d5-dd5fef760048","Type":"ContainerDied","Data":"16ed9a49d9e62039e36ad6f1de454b0a6c47da7e440caaad17196d6b1437487b"} Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.936561 4725 scope.go:117] "RemoveContainer" containerID="13365fc3187adef3fb3f93785242333a32116046fb72d10cc11b57831df308f0" Feb 27 06:59:23 crc kubenswrapper[4725]: I0227 06:59:23.995279 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72d9t"] Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.002042 4725 scope.go:117] "RemoveContainer" containerID="45b69d93649c63761f6e61746b63ebe9a185df9999da17b48bd5b133fa7ffda5" Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.011161 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72d9t"] Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.022001 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nl9m"] Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.028564 4725 scope.go:117] "RemoveContainer" containerID="3f16505213edd59ee2b2e24b9a6431fe48d58d62ced23b40824d6ef57380c629" Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.031413 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8nl9m"] Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.263355 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" path="/var/lib/kubelet/pods/1d633e54-e2ff-4c65-9cb6-f3ba5279fbde/volumes" Feb 27 06:59:24 crc kubenswrapper[4725]: I0227 06:59:24.263995 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" path="/var/lib/kubelet/pods/7e9a4083-c9d6-4059-85d5-dd5fef760048/volumes" Feb 27 06:59:29 crc kubenswrapper[4725]: E0227 06:59:29.493469 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 06:59:32 crc kubenswrapper[4725]: I0227 06:59:32.554263 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 06:59:32 crc kubenswrapper[4725]: I0227 06:59:32.554940 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 06:59:32 crc kubenswrapper[4725]: I0227 06:59:32.554991 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 06:59:32 crc kubenswrapper[4725]: I0227 06:59:32.556109 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 06:59:32 crc kubenswrapper[4725]: I0227 06:59:32.556168 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" gracePeriod=600 Feb 27 06:59:33 crc kubenswrapper[4725]: I0227 06:59:33.061147 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" exitCode=0 Feb 27 06:59:33 crc kubenswrapper[4725]: I0227 06:59:33.061198 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa"} Feb 27 06:59:33 crc kubenswrapper[4725]: I0227 06:59:33.061235 4725 scope.go:117] "RemoveContainer" containerID="dc4f505db5c927b1115d9bb5cb4e6488533e4b86cbba3fd83d6f19a11a64f81b" Feb 27 06:59:36 crc kubenswrapper[4725]: E0227 06:59:36.070005 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:59:36 crc kubenswrapper[4725]: I0227 06:59:36.109155 4725 
scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 06:59:36 crc kubenswrapper[4725]: E0227 06:59:36.109623 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:59:37 crc kubenswrapper[4725]: I0227 06:59:37.125115 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4ededce-4af9-418c-af09-c79e79cb044f","Type":"ContainerStarted","Data":"60870ee4b4edbe49c0eccdf1ff1496c815205624b5ab0c6d0e3e4b77cf442e6a"} Feb 27 06:59:37 crc kubenswrapper[4725]: I0227 06:59:37.159685 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.364852971 podStartE2EDuration="16.159658949s" podCreationTimestamp="2026-02-27 06:59:21 +0000 UTC" firstStartedPulling="2026-02-27 06:59:23.382076771 +0000 UTC m=+2941.844697350" lastFinishedPulling="2026-02-27 06:59:36.176882759 +0000 UTC m=+2954.639503328" observedRunningTime="2026-02-27 06:59:37.150869791 +0000 UTC m=+2955.613490370" watchObservedRunningTime="2026-02-27 06:59:37.159658949 +0000 UTC m=+2955.622279518" Feb 27 06:59:39 crc kubenswrapper[4725]: E0227 06:59:39.795247 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 06:59:48 crc kubenswrapper[4725]: I0227 06:59:48.250824 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 06:59:48 crc kubenswrapper[4725]: E0227 06:59:48.251618 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 06:59:50 crc kubenswrapper[4725]: E0227 06:59:50.164580 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.170578 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536260-g7r92"] Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.171669 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="extract-utilities" Feb 27 07:00:00 crc kubenswrapper[4725]: 
I0227 07:00:00.171691 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="extract-utilities" Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.171726 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="extract-utilities" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.171733 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="extract-utilities" Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.171746 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="extract-content" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.171753 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="extract-content" Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.171766 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="registry-server" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.171776 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="registry-server" Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.171793 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="registry-server" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.171802 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="registry-server" Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.171834 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="extract-content" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 
07:00:00.171843 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="extract-content" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.172113 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d633e54-e2ff-4c65-9cb6-f3ba5279fbde" containerName="registry-server" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.172147 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9a4083-c9d6-4059-85d5-dd5fef760048" containerName="registry-server" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.173033 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.177705 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.178105 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.184184 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc"] Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.186028 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.186681 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.191553 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.194107 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.195334 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536260-g7r92"] Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.209395 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc"] Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.229824 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-secret-volume\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.229961 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2h9x\" (UniqueName: \"kubernetes.io/projected/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-kube-api-access-k2h9x\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 
07:00:00.229986 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5hc4\" (UniqueName: \"kubernetes.io/projected/f348fec4-84f9-472f-b06d-a053375f2ffb-kube-api-access-q5hc4\") pod \"auto-csr-approver-29536260-g7r92\" (UID: \"f348fec4-84f9-472f-b06d-a053375f2ffb\") " pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.230002 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-config-volume\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.331434 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2h9x\" (UniqueName: \"kubernetes.io/projected/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-kube-api-access-k2h9x\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.331484 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-config-volume\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.331505 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5hc4\" (UniqueName: \"kubernetes.io/projected/f348fec4-84f9-472f-b06d-a053375f2ffb-kube-api-access-q5hc4\") pod \"auto-csr-approver-29536260-g7r92\" 
(UID: \"f348fec4-84f9-472f-b06d-a053375f2ffb\") " pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.331596 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-secret-volume\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.333207 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-config-volume\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.349652 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-secret-volume\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.350094 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5hc4\" (UniqueName: \"kubernetes.io/projected/f348fec4-84f9-472f-b06d-a053375f2ffb-kube-api-access-q5hc4\") pod \"auto-csr-approver-29536260-g7r92\" (UID: \"f348fec4-84f9-472f-b06d-a053375f2ffb\") " pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.352081 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2h9x\" (UniqueName: 
\"kubernetes.io/projected/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-kube-api-access-k2h9x\") pod \"collect-profiles-29536260-ll9cc\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: E0227 07:00:00.455036 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.501538 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.516473 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:00 crc kubenswrapper[4725]: I0227 07:00:00.981779 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536260-g7r92"] Feb 27 07:00:01 crc kubenswrapper[4725]: I0227 07:00:01.052629 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc"] Feb 27 07:00:01 crc kubenswrapper[4725]: W0227 07:00:01.055028 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0f22c8_e06d_4fa8_af82_bc2a826cafa8.slice/crio-04487dbaa9b40e9a17482e03a81cea2bb16642d120686a861b031bb5d14374ed WatchSource:0}: Error finding container 04487dbaa9b40e9a17482e03a81cea2bb16642d120686a861b031bb5d14374ed: Status 404 returned error can't find the container with id 04487dbaa9b40e9a17482e03a81cea2bb16642d120686a861b031bb5d14374ed Feb 27 07:00:01 crc kubenswrapper[4725]: I0227 07:00:01.378727 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536260-g7r92" event={"ID":"f348fec4-84f9-472f-b06d-a053375f2ffb","Type":"ContainerStarted","Data":"1394cef39ef42ad4e142a26ab76a414de3c4dcb30c77808ed94233ffdc3e1f9a"} Feb 27 07:00:01 crc kubenswrapper[4725]: I0227 07:00:01.379995 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" event={"ID":"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8","Type":"ContainerStarted","Data":"62fc08cb7eb896c9f15ce0eb285c6443dbfae34c972b4a374db616d054bd7c9a"} Feb 27 07:00:01 crc kubenswrapper[4725]: I0227 07:00:01.380018 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" 
event={"ID":"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8","Type":"ContainerStarted","Data":"04487dbaa9b40e9a17482e03a81cea2bb16642d120686a861b031bb5d14374ed"} Feb 27 07:00:01 crc kubenswrapper[4725]: I0227 07:00:01.395660 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" podStartSLOduration=1.395641847 podStartE2EDuration="1.395641847s" podCreationTimestamp="2026-02-27 07:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 07:00:01.39468207 +0000 UTC m=+2979.857302639" watchObservedRunningTime="2026-02-27 07:00:01.395641847 +0000 UTC m=+2979.858262416" Feb 27 07:00:02 crc kubenswrapper[4725]: I0227 07:00:02.261777 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:00:02 crc kubenswrapper[4725]: E0227 07:00:02.262845 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:00:02 crc kubenswrapper[4725]: I0227 07:00:02.392221 4725 generic.go:334] "Generic (PLEG): container finished" podID="ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" containerID="62fc08cb7eb896c9f15ce0eb285c6443dbfae34c972b4a374db616d054bd7c9a" exitCode=0 Feb 27 07:00:02 crc kubenswrapper[4725]: I0227 07:00:02.392313 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" 
event={"ID":"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8","Type":"ContainerDied","Data":"62fc08cb7eb896c9f15ce0eb285c6443dbfae34c972b4a374db616d054bd7c9a"} Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.403206 4725 generic.go:334] "Generic (PLEG): container finished" podID="f348fec4-84f9-472f-b06d-a053375f2ffb" containerID="e0df7edf24f707ce5134d77c1f9dfae348dfeeb04e24ebfdddc38ada58313cfa" exitCode=0 Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.404026 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536260-g7r92" event={"ID":"f348fec4-84f9-472f-b06d-a053375f2ffb","Type":"ContainerDied","Data":"e0df7edf24f707ce5134d77c1f9dfae348dfeeb04e24ebfdddc38ada58313cfa"} Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.769328 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.809956 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2h9x\" (UniqueName: \"kubernetes.io/projected/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-kube-api-access-k2h9x\") pod \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.810163 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-secret-volume\") pod \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\" (UID: \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.810209 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-config-volume\") pod \"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\" (UID: 
\"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8\") " Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.811047 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" (UID: "ce0f22c8-e06d-4fa8-af82-bc2a826cafa8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.824690 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" (UID: "ce0f22c8-e06d-4fa8-af82-bc2a826cafa8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.829435 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-kube-api-access-k2h9x" (OuterVolumeSpecName: "kube-api-access-k2h9x") pod "ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" (UID: "ce0f22c8-e06d-4fa8-af82-bc2a826cafa8"). InnerVolumeSpecName "kube-api-access-k2h9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.912773 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2h9x\" (UniqueName: \"kubernetes.io/projected/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-kube-api-access-k2h9x\") on node \"crc\" DevicePath \"\"" Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.912814 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:00:03 crc kubenswrapper[4725]: I0227 07:00:03.912825 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.416983 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" event={"ID":"ce0f22c8-e06d-4fa8-af82-bc2a826cafa8","Type":"ContainerDied","Data":"04487dbaa9b40e9a17482e03a81cea2bb16642d120686a861b031bb5d14374ed"} Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.417045 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04487dbaa9b40e9a17482e03a81cea2bb16642d120686a861b031bb5d14374ed" Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.419152 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc" Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.478774 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp"] Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.523823 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536215-v2ktp"] Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.858815 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.936643 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5hc4\" (UniqueName: \"kubernetes.io/projected/f348fec4-84f9-472f-b06d-a053375f2ffb-kube-api-access-q5hc4\") pod \"f348fec4-84f9-472f-b06d-a053375f2ffb\" (UID: \"f348fec4-84f9-472f-b06d-a053375f2ffb\") " Feb 27 07:00:04 crc kubenswrapper[4725]: I0227 07:00:04.943251 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f348fec4-84f9-472f-b06d-a053375f2ffb-kube-api-access-q5hc4" (OuterVolumeSpecName: "kube-api-access-q5hc4") pod "f348fec4-84f9-472f-b06d-a053375f2ffb" (UID: "f348fec4-84f9-472f-b06d-a053375f2ffb"). InnerVolumeSpecName "kube-api-access-q5hc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:00:05 crc kubenswrapper[4725]: I0227 07:00:05.039111 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5hc4\" (UniqueName: \"kubernetes.io/projected/f348fec4-84f9-472f-b06d-a053375f2ffb-kube-api-access-q5hc4\") on node \"crc\" DevicePath \"\"" Feb 27 07:00:05 crc kubenswrapper[4725]: I0227 07:00:05.430358 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536260-g7r92" event={"ID":"f348fec4-84f9-472f-b06d-a053375f2ffb","Type":"ContainerDied","Data":"1394cef39ef42ad4e142a26ab76a414de3c4dcb30c77808ed94233ffdc3e1f9a"} Feb 27 07:00:05 crc kubenswrapper[4725]: I0227 07:00:05.430628 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1394cef39ef42ad4e142a26ab76a414de3c4dcb30c77808ed94233ffdc3e1f9a" Feb 27 07:00:05 crc kubenswrapper[4725]: I0227 07:00:05.430452 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536260-g7r92" Feb 27 07:00:05 crc kubenswrapper[4725]: I0227 07:00:05.931152 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536254-zmcfp"] Feb 27 07:00:05 crc kubenswrapper[4725]: I0227 07:00:05.942195 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536254-zmcfp"] Feb 27 07:00:06 crc kubenswrapper[4725]: I0227 07:00:06.262200 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08cdf6e1-7c17-4514-9106-be74317e08b1" path="/var/lib/kubelet/pods/08cdf6e1-7c17-4514-9106-be74317e08b1/volumes" Feb 27 07:00:06 crc kubenswrapper[4725]: I0227 07:00:06.263805 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c966250-ab30-4713-ba4f-c19bd653309d" path="/var/lib/kubelet/pods/5c966250-ab30-4713-ba4f-c19bd653309d/volumes" Feb 27 07:00:10 crc kubenswrapper[4725]: E0227 07:00:10.705632 4725 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 07:00:15 crc kubenswrapper[4725]: I0227 07:00:15.251896 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:00:15 crc kubenswrapper[4725]: E0227 07:00:15.252474 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:00:20 crc kubenswrapper[4725]: E0227 07:00:20.980831 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-conmon-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d633e54_e2ff_4c65_9cb6_f3ba5279fbde.slice/crio-546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 07:00:22 crc kubenswrapper[4725]: E0227 07:00:22.297673 4725 
fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openshift-marketplace_community-operators-8nl9m_1d633e54-e2ff-4c65-9cb6-f3ba5279fbde/registry-server/0.log" to get inode usage: stat /var/log/pods/openshift-marketplace_community-operators-8nl9m_1d633e54-e2ff-4c65-9cb6-f3ba5279fbde/registry-server/0.log: no such file or directory Feb 27 07:00:28 crc kubenswrapper[4725]: I0227 07:00:28.252077 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:00:28 crc kubenswrapper[4725]: E0227 07:00:28.252992 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:00:40 crc kubenswrapper[4725]: I0227 07:00:40.252507 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:00:40 crc kubenswrapper[4725]: E0227 07:00:40.253404 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:00:52 crc kubenswrapper[4725]: I0227 07:00:52.264934 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:00:52 crc kubenswrapper[4725]: E0227 07:00:52.265819 4725 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.152910 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29536261-jgbdj"] Feb 27 07:01:00 crc kubenswrapper[4725]: E0227 07:01:00.153737 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f348fec4-84f9-472f-b06d-a053375f2ffb" containerName="oc" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.153751 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f348fec4-84f9-472f-b06d-a053375f2ffb" containerName="oc" Feb 27 07:01:00 crc kubenswrapper[4725]: E0227 07:01:00.153770 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" containerName="collect-profiles" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.153775 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" containerName="collect-profiles" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.153969 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="f348fec4-84f9-472f-b06d-a053375f2ffb" containerName="oc" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.153997 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" containerName="collect-profiles" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.154718 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.184931 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29536261-jgbdj"] Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.271341 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4klkd\" (UniqueName: \"kubernetes.io/projected/c53bb79f-c970-4c9e-9a11-c8961e8041ce-kube-api-access-4klkd\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.272104 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-combined-ca-bundle\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.272362 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-fernet-keys\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.272910 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-config-data\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.375040 4725 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-config-data\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.375173 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4klkd\" (UniqueName: \"kubernetes.io/projected/c53bb79f-c970-4c9e-9a11-c8961e8041ce-kube-api-access-4klkd\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.375260 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-combined-ca-bundle\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.375337 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-fernet-keys\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.382424 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-fernet-keys\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.383159 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-config-data\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.385134 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-combined-ca-bundle\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.410406 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4klkd\" (UniqueName: \"kubernetes.io/projected/c53bb79f-c970-4c9e-9a11-c8961e8041ce-kube-api-access-4klkd\") pod \"keystone-cron-29536261-jgbdj\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.477021 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:00 crc kubenswrapper[4725]: I0227 07:01:00.944673 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29536261-jgbdj"] Feb 27 07:01:01 crc kubenswrapper[4725]: I0227 07:01:01.077090 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29536261-jgbdj" event={"ID":"c53bb79f-c970-4c9e-9a11-c8961e8041ce","Type":"ContainerStarted","Data":"ab88eb67ee592a462e3f1dc52335948c158c668bc974a9a1df7290c097e13973"} Feb 27 07:01:02 crc kubenswrapper[4725]: I0227 07:01:02.087399 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29536261-jgbdj" event={"ID":"c53bb79f-c970-4c9e-9a11-c8961e8041ce","Type":"ContainerStarted","Data":"8e4b3a6efc1fa9d08c82b2acc865c752d9adfa4f9d2c121c2205acee03784104"} Feb 27 07:01:02 crc kubenswrapper[4725]: I0227 07:01:02.112729 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29536261-jgbdj" podStartSLOduration=2.112705842 podStartE2EDuration="2.112705842s" podCreationTimestamp="2026-02-27 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 07:01:02.107006231 +0000 UTC m=+3040.569626800" watchObservedRunningTime="2026-02-27 07:01:02.112705842 +0000 UTC m=+3040.575326421" Feb 27 07:01:04 crc kubenswrapper[4725]: I0227 07:01:04.613275 4725 scope.go:117] "RemoveContainer" containerID="e9533316ded9adecbc4eaf3ca53de475bfd984f1ec3585172bd6c81b05b73042" Feb 27 07:01:04 crc kubenswrapper[4725]: I0227 07:01:04.650156 4725 scope.go:117] "RemoveContainer" containerID="9e211eea55d7ea1ef8bff2b561ba8c249722e7ea7aaad8334d07f23ac4026325" Feb 27 07:01:05 crc kubenswrapper[4725]: I0227 07:01:05.130400 4725 generic.go:334] "Generic (PLEG): container finished" podID="c53bb79f-c970-4c9e-9a11-c8961e8041ce" 
containerID="8e4b3a6efc1fa9d08c82b2acc865c752d9adfa4f9d2c121c2205acee03784104" exitCode=0 Feb 27 07:01:05 crc kubenswrapper[4725]: I0227 07:01:05.130470 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29536261-jgbdj" event={"ID":"c53bb79f-c970-4c9e-9a11-c8961e8041ce","Type":"ContainerDied","Data":"8e4b3a6efc1fa9d08c82b2acc865c752d9adfa4f9d2c121c2205acee03784104"} Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.253183 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:01:06 crc kubenswrapper[4725]: E0227 07:01:06.253874 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.578911 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.716166 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-fernet-keys\") pod \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.716256 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4klkd\" (UniqueName: \"kubernetes.io/projected/c53bb79f-c970-4c9e-9a11-c8961e8041ce-kube-api-access-4klkd\") pod \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.716332 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-combined-ca-bundle\") pod \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.716533 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-config-data\") pod \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\" (UID: \"c53bb79f-c970-4c9e-9a11-c8961e8041ce\") " Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.721680 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c53bb79f-c970-4c9e-9a11-c8961e8041ce" (UID: "c53bb79f-c970-4c9e-9a11-c8961e8041ce"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.722275 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53bb79f-c970-4c9e-9a11-c8961e8041ce-kube-api-access-4klkd" (OuterVolumeSpecName: "kube-api-access-4klkd") pod "c53bb79f-c970-4c9e-9a11-c8961e8041ce" (UID: "c53bb79f-c970-4c9e-9a11-c8961e8041ce"). InnerVolumeSpecName "kube-api-access-4klkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.749442 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c53bb79f-c970-4c9e-9a11-c8961e8041ce" (UID: "c53bb79f-c970-4c9e-9a11-c8961e8041ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.774195 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-config-data" (OuterVolumeSpecName: "config-data") pod "c53bb79f-c970-4c9e-9a11-c8961e8041ce" (UID: "c53bb79f-c970-4c9e-9a11-c8961e8041ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.819204 4725 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.819243 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4klkd\" (UniqueName: \"kubernetes.io/projected/c53bb79f-c970-4c9e-9a11-c8961e8041ce-kube-api-access-4klkd\") on node \"crc\" DevicePath \"\"" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.819254 4725 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 07:01:06 crc kubenswrapper[4725]: I0227 07:01:06.819264 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53bb79f-c970-4c9e-9a11-c8961e8041ce-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 07:01:07 crc kubenswrapper[4725]: I0227 07:01:07.185023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29536261-jgbdj" event={"ID":"c53bb79f-c970-4c9e-9a11-c8961e8041ce","Type":"ContainerDied","Data":"ab88eb67ee592a462e3f1dc52335948c158c668bc974a9a1df7290c097e13973"} Feb 27 07:01:07 crc kubenswrapper[4725]: I0227 07:01:07.185460 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab88eb67ee592a462e3f1dc52335948c158c668bc974a9a1df7290c097e13973" Feb 27 07:01:07 crc kubenswrapper[4725]: I0227 07:01:07.185142 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29536261-jgbdj" Feb 27 07:01:17 crc kubenswrapper[4725]: I0227 07:01:17.251908 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:01:17 crc kubenswrapper[4725]: E0227 07:01:17.253102 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:01:29 crc kubenswrapper[4725]: I0227 07:01:29.254307 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:01:29 crc kubenswrapper[4725]: E0227 07:01:29.254942 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:01:44 crc kubenswrapper[4725]: I0227 07:01:44.252193 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:01:44 crc kubenswrapper[4725]: E0227 07:01:44.252910 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:01:58 crc kubenswrapper[4725]: I0227 07:01:58.252384 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:01:58 crc kubenswrapper[4725]: E0227 07:01:58.253228 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.170576 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536262-tvfks"] Feb 27 07:02:00 crc kubenswrapper[4725]: E0227 07:02:00.171404 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53bb79f-c970-4c9e-9a11-c8961e8041ce" containerName="keystone-cron" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.171423 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53bb79f-c970-4c9e-9a11-c8961e8041ce" containerName="keystone-cron" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.171730 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53bb79f-c970-4c9e-9a11-c8961e8041ce" containerName="keystone-cron" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.173195 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.176156 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.176416 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.176573 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.185414 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536262-tvfks"] Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.231433 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfpj\" (UniqueName: \"kubernetes.io/projected/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808-kube-api-access-wwfpj\") pod \"auto-csr-approver-29536262-tvfks\" (UID: \"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808\") " pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.333410 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfpj\" (UniqueName: \"kubernetes.io/projected/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808-kube-api-access-wwfpj\") pod \"auto-csr-approver-29536262-tvfks\" (UID: \"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808\") " pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.359912 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfpj\" (UniqueName: \"kubernetes.io/projected/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808-kube-api-access-wwfpj\") pod \"auto-csr-approver-29536262-tvfks\" (UID: \"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808\") " 
pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:00 crc kubenswrapper[4725]: I0227 07:02:00.511820 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:01 crc kubenswrapper[4725]: I0227 07:02:01.021232 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536262-tvfks"] Feb 27 07:02:01 crc kubenswrapper[4725]: I0227 07:02:01.778607 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536262-tvfks" event={"ID":"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808","Type":"ContainerStarted","Data":"2825b5728686fa102e2c5329fe41ca59886a90c314351b070a45dcca01af0b21"} Feb 27 07:02:02 crc kubenswrapper[4725]: I0227 07:02:02.795382 4725 generic.go:334] "Generic (PLEG): container finished" podID="0c92e3ce-43ab-4895-8fd1-14c0c1fe3808" containerID="57dc0075556bc17b7bb4b9a8e4baf49a52cb7edc77f090e1f3072a18a799c4e4" exitCode=0 Feb 27 07:02:02 crc kubenswrapper[4725]: I0227 07:02:02.795459 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536262-tvfks" event={"ID":"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808","Type":"ContainerDied","Data":"57dc0075556bc17b7bb4b9a8e4baf49a52cb7edc77f090e1f3072a18a799c4e4"} Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.158094 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.238662 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwfpj\" (UniqueName: \"kubernetes.io/projected/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808-kube-api-access-wwfpj\") pod \"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808\" (UID: \"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808\") " Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.256775 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808-kube-api-access-wwfpj" (OuterVolumeSpecName: "kube-api-access-wwfpj") pod "0c92e3ce-43ab-4895-8fd1-14c0c1fe3808" (UID: "0c92e3ce-43ab-4895-8fd1-14c0c1fe3808"). InnerVolumeSpecName "kube-api-access-wwfpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.340877 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwfpj\" (UniqueName: \"kubernetes.io/projected/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808-kube-api-access-wwfpj\") on node \"crc\" DevicePath \"\"" Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.814870 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536262-tvfks" event={"ID":"0c92e3ce-43ab-4895-8fd1-14c0c1fe3808","Type":"ContainerDied","Data":"2825b5728686fa102e2c5329fe41ca59886a90c314351b070a45dcca01af0b21"} Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.814908 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2825b5728686fa102e2c5329fe41ca59886a90c314351b070a45dcca01af0b21" Feb 27 07:02:04 crc kubenswrapper[4725]: I0227 07:02:04.814957 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536262-tvfks" Feb 27 07:02:05 crc kubenswrapper[4725]: I0227 07:02:05.232837 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536256-kxtdc"] Feb 27 07:02:05 crc kubenswrapper[4725]: I0227 07:02:05.241472 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536256-kxtdc"] Feb 27 07:02:06 crc kubenswrapper[4725]: I0227 07:02:06.283976 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4784fd46-fab3-4706-b4e9-53818d8889e4" path="/var/lib/kubelet/pods/4784fd46-fab3-4706-b4e9-53818d8889e4/volumes" Feb 27 07:02:10 crc kubenswrapper[4725]: I0227 07:02:10.252149 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:02:10 crc kubenswrapper[4725]: E0227 07:02:10.252629 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:02:22 crc kubenswrapper[4725]: I0227 07:02:22.263459 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:02:22 crc kubenswrapper[4725]: E0227 07:02:22.269562 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:02:35 crc kubenswrapper[4725]: I0227 07:02:35.252561 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:02:35 crc kubenswrapper[4725]: E0227 07:02:35.253522 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:02:50 crc kubenswrapper[4725]: I0227 07:02:50.251732 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:02:50 crc kubenswrapper[4725]: E0227 07:02:50.252726 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:03:02 crc kubenswrapper[4725]: I0227 07:03:02.257760 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:03:02 crc kubenswrapper[4725]: E0227 07:03:02.259077 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:03:04 crc kubenswrapper[4725]: I0227 07:03:04.778974 4725 scope.go:117] "RemoveContainer" containerID="185b997d2e59908445f218b179e89cc851877c18a9f3af276e1a8c5ffd92659b" Feb 27 07:03:15 crc kubenswrapper[4725]: I0227 07:03:15.252745 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:03:15 crc kubenswrapper[4725]: E0227 07:03:15.253980 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:03:27 crc kubenswrapper[4725]: I0227 07:03:27.252614 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:03:27 crc kubenswrapper[4725]: E0227 07:03:27.255620 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:03:41 crc kubenswrapper[4725]: I0227 07:03:41.251360 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:03:41 crc kubenswrapper[4725]: E0227 07:03:41.252175 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:03:53 crc kubenswrapper[4725]: I0227 07:03:53.252621 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:03:53 crc kubenswrapper[4725]: E0227 07:03:53.253644 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.144835 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536264-ktxjg"] Feb 27 07:04:00 crc kubenswrapper[4725]: E0227 07:04:00.145780 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c92e3ce-43ab-4895-8fd1-14c0c1fe3808" containerName="oc" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.145793 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c92e3ce-43ab-4895-8fd1-14c0c1fe3808" containerName="oc" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.146003 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c92e3ce-43ab-4895-8fd1-14c0c1fe3808" containerName="oc" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.146757 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.156691 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.156713 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.157448 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.159229 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536264-ktxjg"] Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.293217 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqrn8\" (UniqueName: \"kubernetes.io/projected/ccc3820e-2105-4ce3-be43-598af2af2beb-kube-api-access-xqrn8\") pod \"auto-csr-approver-29536264-ktxjg\" (UID: \"ccc3820e-2105-4ce3-be43-598af2af2beb\") " pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.395151 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqrn8\" (UniqueName: \"kubernetes.io/projected/ccc3820e-2105-4ce3-be43-598af2af2beb-kube-api-access-xqrn8\") pod \"auto-csr-approver-29536264-ktxjg\" (UID: \"ccc3820e-2105-4ce3-be43-598af2af2beb\") " pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.423750 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqrn8\" (UniqueName: \"kubernetes.io/projected/ccc3820e-2105-4ce3-be43-598af2af2beb-kube-api-access-xqrn8\") pod \"auto-csr-approver-29536264-ktxjg\" (UID: \"ccc3820e-2105-4ce3-be43-598af2af2beb\") " 
pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.466792 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:00 crc kubenswrapper[4725]: I0227 07:04:00.954578 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536264-ktxjg"] Feb 27 07:04:01 crc kubenswrapper[4725]: I0227 07:04:01.078997 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" event={"ID":"ccc3820e-2105-4ce3-be43-598af2af2beb","Type":"ContainerStarted","Data":"94260e4681afa76c29ca4de580afc9e25eb80fd51d8dfe0e8d95f2bd98cd6b1f"} Feb 27 07:04:03 crc kubenswrapper[4725]: I0227 07:04:03.115709 4725 generic.go:334] "Generic (PLEG): container finished" podID="ccc3820e-2105-4ce3-be43-598af2af2beb" containerID="71571c1166ee97ec13f1122e8982ff874bc1df6a6333451ff44e51e9fe5b9480" exitCode=0 Feb 27 07:04:03 crc kubenswrapper[4725]: I0227 07:04:03.115810 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" event={"ID":"ccc3820e-2105-4ce3-be43-598af2af2beb","Type":"ContainerDied","Data":"71571c1166ee97ec13f1122e8982ff874bc1df6a6333451ff44e51e9fe5b9480"} Feb 27 07:04:04 crc kubenswrapper[4725]: I0227 07:04:04.469332 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:04 crc kubenswrapper[4725]: I0227 07:04:04.587784 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqrn8\" (UniqueName: \"kubernetes.io/projected/ccc3820e-2105-4ce3-be43-598af2af2beb-kube-api-access-xqrn8\") pod \"ccc3820e-2105-4ce3-be43-598af2af2beb\" (UID: \"ccc3820e-2105-4ce3-be43-598af2af2beb\") " Feb 27 07:04:04 crc kubenswrapper[4725]: I0227 07:04:04.594825 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc3820e-2105-4ce3-be43-598af2af2beb-kube-api-access-xqrn8" (OuterVolumeSpecName: "kube-api-access-xqrn8") pod "ccc3820e-2105-4ce3-be43-598af2af2beb" (UID: "ccc3820e-2105-4ce3-be43-598af2af2beb"). InnerVolumeSpecName "kube-api-access-xqrn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:04:04 crc kubenswrapper[4725]: I0227 07:04:04.691982 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqrn8\" (UniqueName: \"kubernetes.io/projected/ccc3820e-2105-4ce3-be43-598af2af2beb-kube-api-access-xqrn8\") on node \"crc\" DevicePath \"\"" Feb 27 07:04:05 crc kubenswrapper[4725]: I0227 07:04:05.141466 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" event={"ID":"ccc3820e-2105-4ce3-be43-598af2af2beb","Type":"ContainerDied","Data":"94260e4681afa76c29ca4de580afc9e25eb80fd51d8dfe0e8d95f2bd98cd6b1f"} Feb 27 07:04:05 crc kubenswrapper[4725]: I0227 07:04:05.141508 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94260e4681afa76c29ca4de580afc9e25eb80fd51d8dfe0e8d95f2bd98cd6b1f" Feb 27 07:04:05 crc kubenswrapper[4725]: I0227 07:04:05.141541 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536264-ktxjg" Feb 27 07:04:05 crc kubenswrapper[4725]: I0227 07:04:05.251399 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:04:05 crc kubenswrapper[4725]: E0227 07:04:05.251780 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:04:05 crc kubenswrapper[4725]: I0227 07:04:05.548614 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536258-hddlw"] Feb 27 07:04:05 crc kubenswrapper[4725]: I0227 07:04:05.557677 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536258-hddlw"] Feb 27 07:04:06 crc kubenswrapper[4725]: I0227 07:04:06.267693 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18a4b60-644f-4d9b-8679-c1034e916b8e" path="/var/lib/kubelet/pods/d18a4b60-644f-4d9b-8679-c1034e916b8e/volumes" Feb 27 07:04:19 crc kubenswrapper[4725]: I0227 07:04:19.252099 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:04:19 crc kubenswrapper[4725]: E0227 07:04:19.252767 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:04:32 crc kubenswrapper[4725]: I0227 07:04:32.261670 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:04:32 crc kubenswrapper[4725]: E0227 07:04:32.262667 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:04:45 crc kubenswrapper[4725]: I0227 07:04:45.253143 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:04:46 crc kubenswrapper[4725]: I0227 07:04:46.171401 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"e601c5ecb917d019ad7cd04e1aba76d1fc6030f46e6fcfdadfb5f33b258056ba"} Feb 27 07:05:04 crc kubenswrapper[4725]: I0227 07:05:04.889512 4725 scope.go:117] "RemoveContainer" containerID="546129c485f1667543c8069c1059f0d6e072d4aa15ff2ab0bba49e024f191da4" Feb 27 07:05:04 crc kubenswrapper[4725]: I0227 07:05:04.934348 4725 scope.go:117] "RemoveContainer" containerID="2c22253d1bec1037ab6f4cbdc205aea240c1625bf9b10104212cb4a7432cba7f" Feb 27 07:05:04 crc kubenswrapper[4725]: I0227 07:05:04.984381 4725 scope.go:117] "RemoveContainer" containerID="7f70efae7f120c37b51032654a72bcba42ac69833a082020e6735663b594690d" Feb 27 07:05:05 crc kubenswrapper[4725]: I0227 07:05:05.022904 4725 scope.go:117] "RemoveContainer" containerID="0d7ef55a496e1475c14594177f285b6f753aef6046fe27e738dd29adc76d793a" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 
07:05:46.508904 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w6hgq"] Feb 27 07:05:46 crc kubenswrapper[4725]: E0227 07:05:46.513794 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc3820e-2105-4ce3-be43-598af2af2beb" containerName="oc" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.513826 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc3820e-2105-4ce3-be43-598af2af2beb" containerName="oc" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.514137 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc3820e-2105-4ce3-be43-598af2af2beb" containerName="oc" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.515913 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.532776 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6hgq"] Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.592296 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-utilities\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.592424 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-catalog-content\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.592622 4725 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7w4l\" (UniqueName: \"kubernetes.io/projected/e46f9f0b-cafb-465a-9244-d0d8a70807aa-kube-api-access-h7w4l\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.694418 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7w4l\" (UniqueName: \"kubernetes.io/projected/e46f9f0b-cafb-465a-9244-d0d8a70807aa-kube-api-access-h7w4l\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.694473 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-utilities\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.694529 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-catalog-content\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.695029 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-utilities\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.695037 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-catalog-content\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.717741 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7w4l\" (UniqueName: \"kubernetes.io/projected/e46f9f0b-cafb-465a-9244-d0d8a70807aa-kube-api-access-h7w4l\") pod \"redhat-marketplace-w6hgq\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:46 crc kubenswrapper[4725]: I0227 07:05:46.852532 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:47 crc kubenswrapper[4725]: I0227 07:05:47.392028 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6hgq"] Feb 27 07:05:47 crc kubenswrapper[4725]: I0227 07:05:47.836384 4725 generic.go:334] "Generic (PLEG): container finished" podID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerID="aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348" exitCode=0 Feb 27 07:05:47 crc kubenswrapper[4725]: I0227 07:05:47.836618 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerDied","Data":"aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348"} Feb 27 07:05:47 crc kubenswrapper[4725]: I0227 07:05:47.836749 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerStarted","Data":"820b897962e4d9a50b95203a5ae1324c82c911d74b422fe221e36ea3e4928749"} Feb 27 07:05:47 crc kubenswrapper[4725]: I0227 
07:05:47.838885 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:05:48 crc kubenswrapper[4725]: I0227 07:05:48.849769 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerStarted","Data":"454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f"} Feb 27 07:05:49 crc kubenswrapper[4725]: I0227 07:05:49.863339 4725 generic.go:334] "Generic (PLEG): container finished" podID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerID="454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f" exitCode=0 Feb 27 07:05:49 crc kubenswrapper[4725]: I0227 07:05:49.863431 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerDied","Data":"454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f"} Feb 27 07:05:51 crc kubenswrapper[4725]: I0227 07:05:51.887916 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerStarted","Data":"0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433"} Feb 27 07:05:51 crc kubenswrapper[4725]: I0227 07:05:51.914315 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w6hgq" podStartSLOduration=3.033677276 podStartE2EDuration="5.914259346s" podCreationTimestamp="2026-02-27 07:05:46 +0000 UTC" firstStartedPulling="2026-02-27 07:05:47.838613982 +0000 UTC m=+3326.301234551" lastFinishedPulling="2026-02-27 07:05:50.719196052 +0000 UTC m=+3329.181816621" observedRunningTime="2026-02-27 07:05:51.908182644 +0000 UTC m=+3330.370803253" watchObservedRunningTime="2026-02-27 07:05:51.914259346 +0000 UTC m=+3330.376879925" Feb 27 07:05:56 crc 
kubenswrapper[4725]: I0227 07:05:56.853197 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:56 crc kubenswrapper[4725]: I0227 07:05:56.854054 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:56 crc kubenswrapper[4725]: I0227 07:05:56.930413 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:57 crc kubenswrapper[4725]: I0227 07:05:57.009848 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:57 crc kubenswrapper[4725]: I0227 07:05:57.176866 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6hgq"] Feb 27 07:05:58 crc kubenswrapper[4725]: I0227 07:05:58.967483 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w6hgq" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="registry-server" containerID="cri-o://0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433" gracePeriod=2 Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.487789 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.614607 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-utilities\") pod \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.614879 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7w4l\" (UniqueName: \"kubernetes.io/projected/e46f9f0b-cafb-465a-9244-d0d8a70807aa-kube-api-access-h7w4l\") pod \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.615159 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-catalog-content\") pod \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\" (UID: \"e46f9f0b-cafb-465a-9244-d0d8a70807aa\") " Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.615583 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-utilities" (OuterVolumeSpecName: "utilities") pod "e46f9f0b-cafb-465a-9244-d0d8a70807aa" (UID: "e46f9f0b-cafb-465a-9244-d0d8a70807aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.615728 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.633185 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46f9f0b-cafb-465a-9244-d0d8a70807aa-kube-api-access-h7w4l" (OuterVolumeSpecName: "kube-api-access-h7w4l") pod "e46f9f0b-cafb-465a-9244-d0d8a70807aa" (UID: "e46f9f0b-cafb-465a-9244-d0d8a70807aa"). InnerVolumeSpecName "kube-api-access-h7w4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.644270 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e46f9f0b-cafb-465a-9244-d0d8a70807aa" (UID: "e46f9f0b-cafb-465a-9244-d0d8a70807aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.718089 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e46f9f0b-cafb-465a-9244-d0d8a70807aa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.718122 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7w4l\" (UniqueName: \"kubernetes.io/projected/e46f9f0b-cafb-465a-9244-d0d8a70807aa-kube-api-access-h7w4l\") on node \"crc\" DevicePath \"\"" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.979110 4725 generic.go:334] "Generic (PLEG): container finished" podID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerID="0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433" exitCode=0 Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.979166 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerDied","Data":"0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433"} Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.979195 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6hgq" event={"ID":"e46f9f0b-cafb-465a-9244-d0d8a70807aa","Type":"ContainerDied","Data":"820b897962e4d9a50b95203a5ae1324c82c911d74b422fe221e36ea3e4928749"} Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.979215 4725 scope.go:117] "RemoveContainer" containerID="0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433" Feb 27 07:05:59 crc kubenswrapper[4725]: I0227 07:05:59.979382 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6hgq" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.017724 4725 scope.go:117] "RemoveContainer" containerID="454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.031052 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6hgq"] Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.044756 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6hgq"] Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.053511 4725 scope.go:117] "RemoveContainer" containerID="aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.091237 4725 scope.go:117] "RemoveContainer" containerID="0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433" Feb 27 07:06:00 crc kubenswrapper[4725]: E0227 07:06:00.091777 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433\": container with ID starting with 0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433 not found: ID does not exist" containerID="0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.091824 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433"} err="failed to get container status \"0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433\": rpc error: code = NotFound desc = could not find container \"0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433\": container with ID starting with 0ec20e1fb35b06b0b9bc8dfe4226d7601f148cfa98a08bee7da394d2a2cae433 not found: 
ID does not exist" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.091851 4725 scope.go:117] "RemoveContainer" containerID="454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f" Feb 27 07:06:00 crc kubenswrapper[4725]: E0227 07:06:00.092146 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f\": container with ID starting with 454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f not found: ID does not exist" containerID="454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.092173 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f"} err="failed to get container status \"454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f\": rpc error: code = NotFound desc = could not find container \"454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f\": container with ID starting with 454fa406877bbd5c4c5251a2082764f0cb6f638d1b8095f7de475053aee5460f not found: ID does not exist" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.092191 4725 scope.go:117] "RemoveContainer" containerID="aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348" Feb 27 07:06:00 crc kubenswrapper[4725]: E0227 07:06:00.092459 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348\": container with ID starting with aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348 not found: ID does not exist" containerID="aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.092491 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348"} err="failed to get container status \"aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348\": rpc error: code = NotFound desc = could not find container \"aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348\": container with ID starting with aee052278215e6934067f31f46ba2a1d6032bf50687b541666ac13bc1ce51348 not found: ID does not exist" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.168669 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536266-lkv6f"] Feb 27 07:06:00 crc kubenswrapper[4725]: E0227 07:06:00.169214 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="extract-utilities" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.169237 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="extract-utilities" Feb 27 07:06:00 crc kubenswrapper[4725]: E0227 07:06:00.169280 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="extract-content" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.169307 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="extract-content" Feb 27 07:06:00 crc kubenswrapper[4725]: E0227 07:06:00.169323 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="registry-server" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.169331 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="registry-server" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.169612 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" containerName="registry-server" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.170498 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.172261 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.172794 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.175616 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.183769 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536266-lkv6f"] Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.267122 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46f9f0b-cafb-465a-9244-d0d8a70807aa" path="/var/lib/kubelet/pods/e46f9f0b-cafb-465a-9244-d0d8a70807aa/volumes" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.328658 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmsm\" (UniqueName: \"kubernetes.io/projected/f81f47c6-3359-48e1-9dd5-68dcd62c5996-kube-api-access-txmsm\") pod \"auto-csr-approver-29536266-lkv6f\" (UID: \"f81f47c6-3359-48e1-9dd5-68dcd62c5996\") " pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.431129 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txmsm\" (UniqueName: \"kubernetes.io/projected/f81f47c6-3359-48e1-9dd5-68dcd62c5996-kube-api-access-txmsm\") pod \"auto-csr-approver-29536266-lkv6f\" (UID: 
\"f81f47c6-3359-48e1-9dd5-68dcd62c5996\") " pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.457181 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txmsm\" (UniqueName: \"kubernetes.io/projected/f81f47c6-3359-48e1-9dd5-68dcd62c5996-kube-api-access-txmsm\") pod \"auto-csr-approver-29536266-lkv6f\" (UID: \"f81f47c6-3359-48e1-9dd5-68dcd62c5996\") " pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.490718 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.970180 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536266-lkv6f"] Feb 27 07:06:00 crc kubenswrapper[4725]: W0227 07:06:00.974170 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf81f47c6_3359_48e1_9dd5_68dcd62c5996.slice/crio-a3ef8d8b164c886ac5b0bf6fa243eea37e6a27e96b8f74c4715f61245498877e WatchSource:0}: Error finding container a3ef8d8b164c886ac5b0bf6fa243eea37e6a27e96b8f74c4715f61245498877e: Status 404 returned error can't find the container with id a3ef8d8b164c886ac5b0bf6fa243eea37e6a27e96b8f74c4715f61245498877e Feb 27 07:06:00 crc kubenswrapper[4725]: I0227 07:06:00.991854 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" event={"ID":"f81f47c6-3359-48e1-9dd5-68dcd62c5996","Type":"ContainerStarted","Data":"a3ef8d8b164c886ac5b0bf6fa243eea37e6a27e96b8f74c4715f61245498877e"} Feb 27 07:06:03 crc kubenswrapper[4725]: I0227 07:06:03.014305 4725 generic.go:334] "Generic (PLEG): container finished" podID="f81f47c6-3359-48e1-9dd5-68dcd62c5996" containerID="9f8b5faf75da22675f6f9a2316127469fef8980e6eff4a813f852cdc42877e73" exitCode=0 
Feb 27 07:06:03 crc kubenswrapper[4725]: I0227 07:06:03.014355 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" event={"ID":"f81f47c6-3359-48e1-9dd5-68dcd62c5996","Type":"ContainerDied","Data":"9f8b5faf75da22675f6f9a2316127469fef8980e6eff4a813f852cdc42877e73"} Feb 27 07:06:04 crc kubenswrapper[4725]: I0227 07:06:04.426479 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:04 crc kubenswrapper[4725]: I0227 07:06:04.528482 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txmsm\" (UniqueName: \"kubernetes.io/projected/f81f47c6-3359-48e1-9dd5-68dcd62c5996-kube-api-access-txmsm\") pod \"f81f47c6-3359-48e1-9dd5-68dcd62c5996\" (UID: \"f81f47c6-3359-48e1-9dd5-68dcd62c5996\") " Feb 27 07:06:04 crc kubenswrapper[4725]: I0227 07:06:04.534567 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81f47c6-3359-48e1-9dd5-68dcd62c5996-kube-api-access-txmsm" (OuterVolumeSpecName: "kube-api-access-txmsm") pod "f81f47c6-3359-48e1-9dd5-68dcd62c5996" (UID: "f81f47c6-3359-48e1-9dd5-68dcd62c5996"). InnerVolumeSpecName "kube-api-access-txmsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:06:04 crc kubenswrapper[4725]: I0227 07:06:04.630858 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txmsm\" (UniqueName: \"kubernetes.io/projected/f81f47c6-3359-48e1-9dd5-68dcd62c5996-kube-api-access-txmsm\") on node \"crc\" DevicePath \"\"" Feb 27 07:06:05 crc kubenswrapper[4725]: I0227 07:06:05.041116 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" event={"ID":"f81f47c6-3359-48e1-9dd5-68dcd62c5996","Type":"ContainerDied","Data":"a3ef8d8b164c886ac5b0bf6fa243eea37e6a27e96b8f74c4715f61245498877e"} Feb 27 07:06:05 crc kubenswrapper[4725]: I0227 07:06:05.041176 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ef8d8b164c886ac5b0bf6fa243eea37e6a27e96b8f74c4715f61245498877e" Feb 27 07:06:05 crc kubenswrapper[4725]: I0227 07:06:05.041244 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536266-lkv6f" Feb 27 07:06:05 crc kubenswrapper[4725]: I0227 07:06:05.495051 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536260-g7r92"] Feb 27 07:06:05 crc kubenswrapper[4725]: I0227 07:06:05.505127 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536260-g7r92"] Feb 27 07:06:06 crc kubenswrapper[4725]: I0227 07:06:06.268447 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f348fec4-84f9-472f-b06d-a053375f2ffb" path="/var/lib/kubelet/pods/f348fec4-84f9-472f-b06d-a053375f2ffb/volumes" Feb 27 07:07:02 crc kubenswrapper[4725]: I0227 07:07:02.554719 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 07:07:02 crc kubenswrapper[4725]: I0227 07:07:02.555257 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:07:05 crc kubenswrapper[4725]: I0227 07:07:05.150496 4725 scope.go:117] "RemoveContainer" containerID="e0df7edf24f707ce5134d77c1f9dfae348dfeeb04e24ebfdddc38ada58313cfa" Feb 27 07:07:32 crc kubenswrapper[4725]: I0227 07:07:32.554372 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:07:32 crc kubenswrapper[4725]: I0227 07:07:32.555036 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.170650 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dprnp"] Feb 27 07:07:46 crc kubenswrapper[4725]: E0227 07:07:46.171719 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81f47c6-3359-48e1-9dd5-68dcd62c5996" containerName="oc" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.171734 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81f47c6-3359-48e1-9dd5-68dcd62c5996" containerName="oc" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.171979 4725 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f81f47c6-3359-48e1-9dd5-68dcd62c5996" containerName="oc" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.173758 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.197639 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dprnp"] Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.375798 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-catalog-content\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.376323 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-utilities\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.376609 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkl4t\" (UniqueName: \"kubernetes.io/projected/6904f022-ec36-4e23-8e25-c189a93622c2-kube-api-access-xkl4t\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.479142 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-catalog-content\") pod \"redhat-operators-dprnp\" (UID: 
\"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.479358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-utilities\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.479406 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkl4t\" (UniqueName: \"kubernetes.io/projected/6904f022-ec36-4e23-8e25-c189a93622c2-kube-api-access-xkl4t\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.479668 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-catalog-content\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.479746 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-utilities\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.503001 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkl4t\" (UniqueName: \"kubernetes.io/projected/6904f022-ec36-4e23-8e25-c189a93622c2-kube-api-access-xkl4t\") pod \"redhat-operators-dprnp\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " 
pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:46 crc kubenswrapper[4725]: I0227 07:07:46.802603 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:47 crc kubenswrapper[4725]: I0227 07:07:47.342217 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dprnp"] Feb 27 07:07:48 crc kubenswrapper[4725]: I0227 07:07:48.140835 4725 generic.go:334] "Generic (PLEG): container finished" podID="6904f022-ec36-4e23-8e25-c189a93622c2" containerID="6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a" exitCode=0 Feb 27 07:07:48 crc kubenswrapper[4725]: I0227 07:07:48.140895 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerDied","Data":"6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a"} Feb 27 07:07:48 crc kubenswrapper[4725]: I0227 07:07:48.140945 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerStarted","Data":"f4f75e87e5a0993d8e7fdb5c0c34022b937fe830e4df9c263dc3aa46d54f7520"} Feb 27 07:07:49 crc kubenswrapper[4725]: I0227 07:07:49.154873 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerStarted","Data":"5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e"} Feb 27 07:07:54 crc kubenswrapper[4725]: I0227 07:07:54.211157 4725 generic.go:334] "Generic (PLEG): container finished" podID="6904f022-ec36-4e23-8e25-c189a93622c2" containerID="5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e" exitCode=0 Feb 27 07:07:54 crc kubenswrapper[4725]: I0227 07:07:54.211233 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerDied","Data":"5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e"} Feb 27 07:07:55 crc kubenswrapper[4725]: I0227 07:07:55.221972 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerStarted","Data":"540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70"} Feb 27 07:07:55 crc kubenswrapper[4725]: I0227 07:07:55.249441 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dprnp" podStartSLOduration=2.435792674 podStartE2EDuration="9.249422393s" podCreationTimestamp="2026-02-27 07:07:46 +0000 UTC" firstStartedPulling="2026-02-27 07:07:48.14303644 +0000 UTC m=+3446.605657009" lastFinishedPulling="2026-02-27 07:07:54.956666159 +0000 UTC m=+3453.419286728" observedRunningTime="2026-02-27 07:07:55.238928117 +0000 UTC m=+3453.701548696" watchObservedRunningTime="2026-02-27 07:07:55.249422393 +0000 UTC m=+3453.712042952" Feb 27 07:07:56 crc kubenswrapper[4725]: I0227 07:07:56.803558 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:56 crc kubenswrapper[4725]: I0227 07:07:56.803847 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:07:57 crc kubenswrapper[4725]: I0227 07:07:57.873951 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dprnp" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="registry-server" probeResult="failure" output=< Feb 27 07:07:57 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:07:57 crc kubenswrapper[4725]: > Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 
07:08:00.150069 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536268-zknfx"] Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.151918 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.154006 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.154258 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.155431 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.174757 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536268-zknfx"] Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.267827 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgjc\" (UniqueName: \"kubernetes.io/projected/b76c3c4c-5b59-4736-9222-90051a209498-kube-api-access-ktgjc\") pod \"auto-csr-approver-29536268-zknfx\" (UID: \"b76c3c4c-5b59-4736-9222-90051a209498\") " pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.369708 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgjc\" (UniqueName: \"kubernetes.io/projected/b76c3c4c-5b59-4736-9222-90051a209498-kube-api-access-ktgjc\") pod \"auto-csr-approver-29536268-zknfx\" (UID: \"b76c3c4c-5b59-4736-9222-90051a209498\") " pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.402061 4725 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-ktgjc\" (UniqueName: \"kubernetes.io/projected/b76c3c4c-5b59-4736-9222-90051a209498-kube-api-access-ktgjc\") pod \"auto-csr-approver-29536268-zknfx\" (UID: \"b76c3c4c-5b59-4736-9222-90051a209498\") " pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.471905 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:00 crc kubenswrapper[4725]: I0227 07:08:00.981552 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536268-zknfx"] Feb 27 07:08:00 crc kubenswrapper[4725]: W0227 07:08:00.987929 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76c3c4c_5b59_4736_9222_90051a209498.slice/crio-74d08a61a96284448dadc450976a8edb737a161dd0b6a7f1012148e6814a6d82 WatchSource:0}: Error finding container 74d08a61a96284448dadc450976a8edb737a161dd0b6a7f1012148e6814a6d82: Status 404 returned error can't find the container with id 74d08a61a96284448dadc450976a8edb737a161dd0b6a7f1012148e6814a6d82 Feb 27 07:08:01 crc kubenswrapper[4725]: I0227 07:08:01.289923 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536268-zknfx" event={"ID":"b76c3c4c-5b59-4736-9222-90051a209498","Type":"ContainerStarted","Data":"74d08a61a96284448dadc450976a8edb737a161dd0b6a7f1012148e6814a6d82"} Feb 27 07:08:02 crc kubenswrapper[4725]: I0227 07:08:02.573147 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:08:02 crc kubenswrapper[4725]: I0227 07:08:02.573812 4725 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:08:02 crc kubenswrapper[4725]: I0227 07:08:02.573863 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:08:02 crc kubenswrapper[4725]: I0227 07:08:02.574717 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e601c5ecb917d019ad7cd04e1aba76d1fc6030f46e6fcfdadfb5f33b258056ba"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:08:02 crc kubenswrapper[4725]: I0227 07:08:02.574774 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://e601c5ecb917d019ad7cd04e1aba76d1fc6030f46e6fcfdadfb5f33b258056ba" gracePeriod=600 Feb 27 07:08:03 crc kubenswrapper[4725]: I0227 07:08:03.327584 4725 generic.go:334] "Generic (PLEG): container finished" podID="b76c3c4c-5b59-4736-9222-90051a209498" containerID="288d3616adda00da859bc2eb52f1be733f94f7fdec6223728c6bddbcf10f2db4" exitCode=0 Feb 27 07:08:03 crc kubenswrapper[4725]: I0227 07:08:03.327627 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536268-zknfx" event={"ID":"b76c3c4c-5b59-4736-9222-90051a209498","Type":"ContainerDied","Data":"288d3616adda00da859bc2eb52f1be733f94f7fdec6223728c6bddbcf10f2db4"} Feb 27 07:08:03 crc kubenswrapper[4725]: I0227 07:08:03.330792 4725 generic.go:334] "Generic (PLEG): container 
finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="e601c5ecb917d019ad7cd04e1aba76d1fc6030f46e6fcfdadfb5f33b258056ba" exitCode=0 Feb 27 07:08:03 crc kubenswrapper[4725]: I0227 07:08:03.330851 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"e601c5ecb917d019ad7cd04e1aba76d1fc6030f46e6fcfdadfb5f33b258056ba"} Feb 27 07:08:03 crc kubenswrapper[4725]: I0227 07:08:03.330887 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc"} Feb 27 07:08:03 crc kubenswrapper[4725]: I0227 07:08:03.330905 4725 scope.go:117] "RemoveContainer" containerID="d5707dc87819b93df4bcf90e010a262745c46da8cea2952f79811c7c7aab49fa" Feb 27 07:08:04 crc kubenswrapper[4725]: I0227 07:08:04.791695 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:04 crc kubenswrapper[4725]: I0227 07:08:04.862635 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktgjc\" (UniqueName: \"kubernetes.io/projected/b76c3c4c-5b59-4736-9222-90051a209498-kube-api-access-ktgjc\") pod \"b76c3c4c-5b59-4736-9222-90051a209498\" (UID: \"b76c3c4c-5b59-4736-9222-90051a209498\") " Feb 27 07:08:04 crc kubenswrapper[4725]: I0227 07:08:04.877721 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76c3c4c-5b59-4736-9222-90051a209498-kube-api-access-ktgjc" (OuterVolumeSpecName: "kube-api-access-ktgjc") pod "b76c3c4c-5b59-4736-9222-90051a209498" (UID: "b76c3c4c-5b59-4736-9222-90051a209498"). InnerVolumeSpecName "kube-api-access-ktgjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:08:04 crc kubenswrapper[4725]: I0227 07:08:04.965579 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktgjc\" (UniqueName: \"kubernetes.io/projected/b76c3c4c-5b59-4736-9222-90051a209498-kube-api-access-ktgjc\") on node \"crc\" DevicePath \"\"" Feb 27 07:08:05 crc kubenswrapper[4725]: I0227 07:08:05.356608 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536268-zknfx" event={"ID":"b76c3c4c-5b59-4736-9222-90051a209498","Type":"ContainerDied","Data":"74d08a61a96284448dadc450976a8edb737a161dd0b6a7f1012148e6814a6d82"} Feb 27 07:08:05 crc kubenswrapper[4725]: I0227 07:08:05.356648 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d08a61a96284448dadc450976a8edb737a161dd0b6a7f1012148e6814a6d82" Feb 27 07:08:05 crc kubenswrapper[4725]: I0227 07:08:05.356699 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536268-zknfx" Feb 27 07:08:05 crc kubenswrapper[4725]: I0227 07:08:05.884463 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536262-tvfks"] Feb 27 07:08:05 crc kubenswrapper[4725]: I0227 07:08:05.894877 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536262-tvfks"] Feb 27 07:08:06 crc kubenswrapper[4725]: I0227 07:08:06.264239 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c92e3ce-43ab-4895-8fd1-14c0c1fe3808" path="/var/lib/kubelet/pods/0c92e3ce-43ab-4895-8fd1-14c0c1fe3808/volumes" Feb 27 07:08:07 crc kubenswrapper[4725]: I0227 07:08:07.865224 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dprnp" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="registry-server" probeResult="failure" output=< Feb 27 07:08:07 crc kubenswrapper[4725]: 
timeout: failed to connect service ":50051" within 1s Feb 27 07:08:07 crc kubenswrapper[4725]: > Feb 27 07:08:16 crc kubenswrapper[4725]: I0227 07:08:16.918753 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:08:17 crc kubenswrapper[4725]: I0227 07:08:17.037591 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:08:17 crc kubenswrapper[4725]: I0227 07:08:17.370614 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dprnp"] Feb 27 07:08:18 crc kubenswrapper[4725]: I0227 07:08:18.466821 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dprnp" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="registry-server" containerID="cri-o://540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70" gracePeriod=2 Feb 27 07:08:18 crc kubenswrapper[4725]: I0227 07:08:18.953253 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.051379 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-utilities\") pod \"6904f022-ec36-4e23-8e25-c189a93622c2\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.051442 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-catalog-content\") pod \"6904f022-ec36-4e23-8e25-c189a93622c2\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.051582 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkl4t\" (UniqueName: \"kubernetes.io/projected/6904f022-ec36-4e23-8e25-c189a93622c2-kube-api-access-xkl4t\") pod \"6904f022-ec36-4e23-8e25-c189a93622c2\" (UID: \"6904f022-ec36-4e23-8e25-c189a93622c2\") " Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.052513 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-utilities" (OuterVolumeSpecName: "utilities") pod "6904f022-ec36-4e23-8e25-c189a93622c2" (UID: "6904f022-ec36-4e23-8e25-c189a93622c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.058592 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6904f022-ec36-4e23-8e25-c189a93622c2-kube-api-access-xkl4t" (OuterVolumeSpecName: "kube-api-access-xkl4t") pod "6904f022-ec36-4e23-8e25-c189a93622c2" (UID: "6904f022-ec36-4e23-8e25-c189a93622c2"). InnerVolumeSpecName "kube-api-access-xkl4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.154131 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.154381 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkl4t\" (UniqueName: \"kubernetes.io/projected/6904f022-ec36-4e23-8e25-c189a93622c2-kube-api-access-xkl4t\") on node \"crc\" DevicePath \"\"" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.202609 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6904f022-ec36-4e23-8e25-c189a93622c2" (UID: "6904f022-ec36-4e23-8e25-c189a93622c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.255367 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6904f022-ec36-4e23-8e25-c189a93622c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.480460 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dprnp" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.480480 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerDied","Data":"540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70"} Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.480524 4725 scope.go:117] "RemoveContainer" containerID="540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.480440 4725 generic.go:334] "Generic (PLEG): container finished" podID="6904f022-ec36-4e23-8e25-c189a93622c2" containerID="540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70" exitCode=0 Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.480576 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dprnp" event={"ID":"6904f022-ec36-4e23-8e25-c189a93622c2","Type":"ContainerDied","Data":"f4f75e87e5a0993d8e7fdb5c0c34022b937fe830e4df9c263dc3aa46d54f7520"} Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.501037 4725 scope.go:117] "RemoveContainer" containerID="5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.518364 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dprnp"] Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.522606 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dprnp"] Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.544456 4725 scope.go:117] "RemoveContainer" containerID="6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.585372 4725 scope.go:117] "RemoveContainer" 
containerID="540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70" Feb 27 07:08:19 crc kubenswrapper[4725]: E0227 07:08:19.588029 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70\": container with ID starting with 540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70 not found: ID does not exist" containerID="540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.588114 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70"} err="failed to get container status \"540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70\": rpc error: code = NotFound desc = could not find container \"540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70\": container with ID starting with 540c98844f624322082dcd95db2c44d2d0fdcd9ad1a7d13944923e00a24b6b70 not found: ID does not exist" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.588139 4725 scope.go:117] "RemoveContainer" containerID="5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e" Feb 27 07:08:19 crc kubenswrapper[4725]: E0227 07:08:19.589380 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e\": container with ID starting with 5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e not found: ID does not exist" containerID="5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.589412 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e"} err="failed to get container status \"5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e\": rpc error: code = NotFound desc = could not find container \"5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e\": container with ID starting with 5bb9022ac9db8d5ec922ba3274db9ae9840c957cda80161adafdb400026fd98e not found: ID does not exist" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.589431 4725 scope.go:117] "RemoveContainer" containerID="6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a" Feb 27 07:08:19 crc kubenswrapper[4725]: E0227 07:08:19.594638 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a\": container with ID starting with 6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a not found: ID does not exist" containerID="6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a" Feb 27 07:08:19 crc kubenswrapper[4725]: I0227 07:08:19.594669 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a"} err="failed to get container status \"6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a\": rpc error: code = NotFound desc = could not find container \"6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a\": container with ID starting with 6bb668ccded0d33b401cbfcf9ecfa9c172ef994525394507ca4b7341cd4f3d2a not found: ID does not exist" Feb 27 07:08:20 crc kubenswrapper[4725]: I0227 07:08:20.263943 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" path="/var/lib/kubelet/pods/6904f022-ec36-4e23-8e25-c189a93622c2/volumes" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 
07:09:03.572296 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hzrvs"] Feb 27 07:09:03 crc kubenswrapper[4725]: E0227 07:09:03.574210 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76c3c4c-5b59-4736-9222-90051a209498" containerName="oc" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.574346 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76c3c4c-5b59-4736-9222-90051a209498" containerName="oc" Feb 27 07:09:03 crc kubenswrapper[4725]: E0227 07:09:03.574453 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="registry-server" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.574536 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="registry-server" Feb 27 07:09:03 crc kubenswrapper[4725]: E0227 07:09:03.574666 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="extract-utilities" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.574751 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="extract-utilities" Feb 27 07:09:03 crc kubenswrapper[4725]: E0227 07:09:03.574845 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="extract-content" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.574908 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="extract-content" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.575197 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6904f022-ec36-4e23-8e25-c189a93622c2" containerName="registry-server" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.575285 4725 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b76c3c4c-5b59-4736-9222-90051a209498" containerName="oc" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.576740 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.584023 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-utilities\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.584558 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-catalog-content\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.584847 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6gpj\" (UniqueName: \"kubernetes.io/projected/227d2548-27eb-4d75-9e74-56d0076775b2-kube-api-access-l6gpj\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.589663 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hzrvs"] Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.686526 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-utilities\") pod \"certified-operators-hzrvs\" (UID: 
\"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.686627 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-catalog-content\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.686663 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6gpj\" (UniqueName: \"kubernetes.io/projected/227d2548-27eb-4d75-9e74-56d0076775b2-kube-api-access-l6gpj\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.687416 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-utilities\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.687626 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-catalog-content\") pod \"certified-operators-hzrvs\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.717357 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6gpj\" (UniqueName: \"kubernetes.io/projected/227d2548-27eb-4d75-9e74-56d0076775b2-kube-api-access-l6gpj\") pod \"certified-operators-hzrvs\" (UID: 
\"227d2548-27eb-4d75-9e74-56d0076775b2\") " pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:03 crc kubenswrapper[4725]: I0227 07:09:03.897730 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:04 crc kubenswrapper[4725]: I0227 07:09:04.434766 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hzrvs"] Feb 27 07:09:04 crc kubenswrapper[4725]: I0227 07:09:04.955723 4725 generic.go:334] "Generic (PLEG): container finished" podID="227d2548-27eb-4d75-9e74-56d0076775b2" containerID="54a809ae459a46a7be0ac03f6e5370358c19c2aa1a099c797ff7f0d6a6da70e0" exitCode=0 Feb 27 07:09:04 crc kubenswrapper[4725]: I0227 07:09:04.957158 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerDied","Data":"54a809ae459a46a7be0ac03f6e5370358c19c2aa1a099c797ff7f0d6a6da70e0"} Feb 27 07:09:04 crc kubenswrapper[4725]: I0227 07:09:04.958831 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerStarted","Data":"07481d51e550115d1da5a042497fb72cc96816e86a45b9499898bea1a595bce4"} Feb 27 07:09:05 crc kubenswrapper[4725]: I0227 07:09:05.277645 4725 scope.go:117] "RemoveContainer" containerID="57dc0075556bc17b7bb4b9a8e4baf49a52cb7edc77f090e1f3072a18a799c4e4" Feb 27 07:09:05 crc kubenswrapper[4725]: I0227 07:09:05.972466 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerStarted","Data":"53c6515c523d6332de31bcb21f4d890c9a10dcd19f52765195bb045f9acf81a9"} Feb 27 07:09:07 crc kubenswrapper[4725]: I0227 07:09:07.992443 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="227d2548-27eb-4d75-9e74-56d0076775b2" containerID="53c6515c523d6332de31bcb21f4d890c9a10dcd19f52765195bb045f9acf81a9" exitCode=0 Feb 27 07:09:07 crc kubenswrapper[4725]: I0227 07:09:07.992496 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerDied","Data":"53c6515c523d6332de31bcb21f4d890c9a10dcd19f52765195bb045f9acf81a9"} Feb 27 07:09:09 crc kubenswrapper[4725]: I0227 07:09:09.009597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerStarted","Data":"3ff58bf466468162ea323b3ce176274ecebc74b626b7f9d516a82890ad5ab8eb"} Feb 27 07:09:09 crc kubenswrapper[4725]: I0227 07:09:09.043631 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hzrvs" podStartSLOduration=2.347075957 podStartE2EDuration="6.04361148s" podCreationTimestamp="2026-02-27 07:09:03 +0000 UTC" firstStartedPulling="2026-02-27 07:09:04.958106713 +0000 UTC m=+3523.420727282" lastFinishedPulling="2026-02-27 07:09:08.654642226 +0000 UTC m=+3527.117262805" observedRunningTime="2026-02-27 07:09:09.029861141 +0000 UTC m=+3527.492481720" watchObservedRunningTime="2026-02-27 07:09:09.04361148 +0000 UTC m=+3527.506232049" Feb 27 07:09:10 crc kubenswrapper[4725]: I0227 07:09:10.957568 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lvcpg"] Feb 27 07:09:10 crc kubenswrapper[4725]: I0227 07:09:10.960978 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:10 crc kubenswrapper[4725]: I0227 07:09:10.976912 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvcpg"] Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.050111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-catalog-content\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.050169 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-utilities\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.050224 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd577\" (UniqueName: \"kubernetes.io/projected/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-kube-api-access-sd577\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.151978 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd577\" (UniqueName: \"kubernetes.io/projected/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-kube-api-access-sd577\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.152150 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-catalog-content\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.152189 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-utilities\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.152594 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-utilities\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.153028 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-catalog-content\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.174211 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd577\" (UniqueName: \"kubernetes.io/projected/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-kube-api-access-sd577\") pod \"community-operators-lvcpg\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.284833 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:11 crc kubenswrapper[4725]: W0227 07:09:11.841055 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7801a2d7_1a2e_453a_964b_1cbd874ee5e4.slice/crio-a491d23834f88a13f6cbd91bb189dbb299b8406de93dbca13ebd95e4d76a0994 WatchSource:0}: Error finding container a491d23834f88a13f6cbd91bb189dbb299b8406de93dbca13ebd95e4d76a0994: Status 404 returned error can't find the container with id a491d23834f88a13f6cbd91bb189dbb299b8406de93dbca13ebd95e4d76a0994 Feb 27 07:09:11 crc kubenswrapper[4725]: I0227 07:09:11.849384 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvcpg"] Feb 27 07:09:12 crc kubenswrapper[4725]: I0227 07:09:12.039617 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerStarted","Data":"a491d23834f88a13f6cbd91bb189dbb299b8406de93dbca13ebd95e4d76a0994"} Feb 27 07:09:13 crc kubenswrapper[4725]: I0227 07:09:13.049637 4725 generic.go:334] "Generic (PLEG): container finished" podID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerID="206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d" exitCode=0 Feb 27 07:09:13 crc kubenswrapper[4725]: I0227 07:09:13.049723 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerDied","Data":"206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d"} Feb 27 07:09:13 crc kubenswrapper[4725]: I0227 07:09:13.897813 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:13 crc kubenswrapper[4725]: I0227 07:09:13.898425 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:13 crc kubenswrapper[4725]: I0227 07:09:13.945857 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:14 crc kubenswrapper[4725]: I0227 07:09:14.117849 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:15 crc kubenswrapper[4725]: I0227 07:09:15.072146 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerStarted","Data":"fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4"} Feb 27 07:09:16 crc kubenswrapper[4725]: I0227 07:09:16.345752 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hzrvs"] Feb 27 07:09:16 crc kubenswrapper[4725]: I0227 07:09:16.346226 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hzrvs" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="registry-server" containerID="cri-o://3ff58bf466468162ea323b3ce176274ecebc74b626b7f9d516a82890ad5ab8eb" gracePeriod=2 Feb 27 07:09:17 crc kubenswrapper[4725]: I0227 07:09:17.092801 4725 generic.go:334] "Generic (PLEG): container finished" podID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerID="fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4" exitCode=0 Feb 27 07:09:17 crc kubenswrapper[4725]: I0227 07:09:17.092875 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerDied","Data":"fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4"} Feb 27 07:09:17 crc kubenswrapper[4725]: I0227 07:09:17.096678 4725 generic.go:334] "Generic 
(PLEG): container finished" podID="227d2548-27eb-4d75-9e74-56d0076775b2" containerID="3ff58bf466468162ea323b3ce176274ecebc74b626b7f9d516a82890ad5ab8eb" exitCode=0 Feb 27 07:09:17 crc kubenswrapper[4725]: I0227 07:09:17.096742 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerDied","Data":"3ff58bf466468162ea323b3ce176274ecebc74b626b7f9d516a82890ad5ab8eb"} Feb 27 07:09:17 crc kubenswrapper[4725]: I0227 07:09:17.983491 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.106980 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hzrvs" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.106970 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hzrvs" event={"ID":"227d2548-27eb-4d75-9e74-56d0076775b2","Type":"ContainerDied","Data":"07481d51e550115d1da5a042497fb72cc96816e86a45b9499898bea1a595bce4"} Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.107147 4725 scope.go:117] "RemoveContainer" containerID="3ff58bf466468162ea323b3ce176274ecebc74b626b7f9d516a82890ad5ab8eb" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.109971 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerStarted","Data":"05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba"} Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.124268 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-utilities\") pod \"227d2548-27eb-4d75-9e74-56d0076775b2\" 
(UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.124376 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6gpj\" (UniqueName: \"kubernetes.io/projected/227d2548-27eb-4d75-9e74-56d0076775b2-kube-api-access-l6gpj\") pod \"227d2548-27eb-4d75-9e74-56d0076775b2\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.124516 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-catalog-content\") pod \"227d2548-27eb-4d75-9e74-56d0076775b2\" (UID: \"227d2548-27eb-4d75-9e74-56d0076775b2\") " Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.125001 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-utilities" (OuterVolumeSpecName: "utilities") pod "227d2548-27eb-4d75-9e74-56d0076775b2" (UID: "227d2548-27eb-4d75-9e74-56d0076775b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.125428 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.134261 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lvcpg" podStartSLOduration=3.479703344 podStartE2EDuration="8.134241693s" podCreationTimestamp="2026-02-27 07:09:10 +0000 UTC" firstStartedPulling="2026-02-27 07:09:13.051689707 +0000 UTC m=+3531.514310276" lastFinishedPulling="2026-02-27 07:09:17.706228046 +0000 UTC m=+3536.168848625" observedRunningTime="2026-02-27 07:09:18.133569294 +0000 UTC m=+3536.596189873" watchObservedRunningTime="2026-02-27 07:09:18.134241693 +0000 UTC m=+3536.596862262" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.136101 4725 scope.go:117] "RemoveContainer" containerID="53c6515c523d6332de31bcb21f4d890c9a10dcd19f52765195bb045f9acf81a9" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.151448 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/227d2548-27eb-4d75-9e74-56d0076775b2-kube-api-access-l6gpj" (OuterVolumeSpecName: "kube-api-access-l6gpj") pod "227d2548-27eb-4d75-9e74-56d0076775b2" (UID: "227d2548-27eb-4d75-9e74-56d0076775b2"). InnerVolumeSpecName "kube-api-access-l6gpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.177640 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "227d2548-27eb-4d75-9e74-56d0076775b2" (UID: "227d2548-27eb-4d75-9e74-56d0076775b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.199315 4725 scope.go:117] "RemoveContainer" containerID="54a809ae459a46a7be0ac03f6e5370358c19c2aa1a099c797ff7f0d6a6da70e0" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.226963 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227d2548-27eb-4d75-9e74-56d0076775b2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.226996 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6gpj\" (UniqueName: \"kubernetes.io/projected/227d2548-27eb-4d75-9e74-56d0076775b2-kube-api-access-l6gpj\") on node \"crc\" DevicePath \"\"" Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.431504 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hzrvs"] Feb 27 07:09:18 crc kubenswrapper[4725]: I0227 07:09:18.440637 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hzrvs"] Feb 27 07:09:20 crc kubenswrapper[4725]: I0227 07:09:20.268383 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" path="/var/lib/kubelet/pods/227d2548-27eb-4d75-9e74-56d0076775b2/volumes" Feb 27 07:09:21 crc kubenswrapper[4725]: I0227 07:09:21.285491 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:21 crc kubenswrapper[4725]: I0227 07:09:21.285600 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:21 crc kubenswrapper[4725]: I0227 07:09:21.344103 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:23 crc kubenswrapper[4725]: I0227 
07:09:23.255399 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:24 crc kubenswrapper[4725]: I0227 07:09:24.349704 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvcpg"] Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.181706 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lvcpg" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="registry-server" containerID="cri-o://05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba" gracePeriod=2 Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.791941 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.912888 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-catalog-content\") pod \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.913111 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-utilities\") pod \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.913194 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd577\" (UniqueName: \"kubernetes.io/projected/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-kube-api-access-sd577\") pod \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\" (UID: \"7801a2d7-1a2e-453a-964b-1cbd874ee5e4\") " Feb 27 07:09:25 crc kubenswrapper[4725]: 
I0227 07:09:25.913920 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-utilities" (OuterVolumeSpecName: "utilities") pod "7801a2d7-1a2e-453a-964b-1cbd874ee5e4" (UID: "7801a2d7-1a2e-453a-964b-1cbd874ee5e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.924636 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-kube-api-access-sd577" (OuterVolumeSpecName: "kube-api-access-sd577") pod "7801a2d7-1a2e-453a-964b-1cbd874ee5e4" (UID: "7801a2d7-1a2e-453a-964b-1cbd874ee5e4"). InnerVolumeSpecName "kube-api-access-sd577". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:09:25 crc kubenswrapper[4725]: I0227 07:09:25.966975 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7801a2d7-1a2e-453a-964b-1cbd874ee5e4" (UID: "7801a2d7-1a2e-453a-964b-1cbd874ee5e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.016127 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.016169 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.016182 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd577\" (UniqueName: \"kubernetes.io/projected/7801a2d7-1a2e-453a-964b-1cbd874ee5e4-kube-api-access-sd577\") on node \"crc\" DevicePath \"\"" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.195510 4725 generic.go:334] "Generic (PLEG): container finished" podID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerID="05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba" exitCode=0 Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.195581 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvcpg" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.195578 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerDied","Data":"05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba"} Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.195649 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvcpg" event={"ID":"7801a2d7-1a2e-453a-964b-1cbd874ee5e4","Type":"ContainerDied","Data":"a491d23834f88a13f6cbd91bb189dbb299b8406de93dbca13ebd95e4d76a0994"} Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.195679 4725 scope.go:117] "RemoveContainer" containerID="05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.229201 4725 scope.go:117] "RemoveContainer" containerID="fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.237746 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvcpg"] Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.250498 4725 scope.go:117] "RemoveContainer" containerID="206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.266080 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lvcpg"] Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.315493 4725 scope.go:117] "RemoveContainer" containerID="05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba" Feb 27 07:09:26 crc kubenswrapper[4725]: E0227 07:09:26.316027 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba\": container with ID starting with 05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba not found: ID does not exist" containerID="05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.316061 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba"} err="failed to get container status \"05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba\": rpc error: code = NotFound desc = could not find container \"05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba\": container with ID starting with 05dbecae4de759def47c8fea52de528401b32b402d5c3838079b7839a453c5ba not found: ID does not exist" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.316116 4725 scope.go:117] "RemoveContainer" containerID="fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4" Feb 27 07:09:26 crc kubenswrapper[4725]: E0227 07:09:26.316639 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4\": container with ID starting with fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4 not found: ID does not exist" containerID="fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.316757 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4"} err="failed to get container status \"fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4\": rpc error: code = NotFound desc = could not find container \"fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4\": container with ID 
starting with fa4498a55914a1634085ce2c8e6d465226a0c8443dc914fe15b8a6ecf08719b4 not found: ID does not exist" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.316773 4725 scope.go:117] "RemoveContainer" containerID="206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d" Feb 27 07:09:26 crc kubenswrapper[4725]: E0227 07:09:26.317196 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d\": container with ID starting with 206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d not found: ID does not exist" containerID="206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d" Feb 27 07:09:26 crc kubenswrapper[4725]: I0227 07:09:26.317228 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d"} err="failed to get container status \"206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d\": rpc error: code = NotFound desc = could not find container \"206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d\": container with ID starting with 206cd78842457677daf1714a02e68500ae50e2ad4b9119db7cab36b97ea8f99d not found: ID does not exist" Feb 27 07:09:28 crc kubenswrapper[4725]: I0227 07:09:28.271706 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" path="/var/lib/kubelet/pods/7801a2d7-1a2e-453a-964b-1cbd874ee5e4/volumes" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.160049 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536270-mxd4f"] Feb 27 07:10:00 crc kubenswrapper[4725]: E0227 07:10:00.161258 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="registry-server" Feb 27 07:10:00 crc 
kubenswrapper[4725]: I0227 07:10:00.161308 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="registry-server" Feb 27 07:10:00 crc kubenswrapper[4725]: E0227 07:10:00.161324 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="extract-utilities" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.161335 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="extract-utilities" Feb 27 07:10:00 crc kubenswrapper[4725]: E0227 07:10:00.161365 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="extract-content" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.161378 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="extract-content" Feb 27 07:10:00 crc kubenswrapper[4725]: E0227 07:10:00.161395 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="extract-utilities" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.161403 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="extract-utilities" Feb 27 07:10:00 crc kubenswrapper[4725]: E0227 07:10:00.161425 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="extract-content" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.161433 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="extract-content" Feb 27 07:10:00 crc kubenswrapper[4725]: E0227 07:10:00.161456 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="registry-server" Feb 27 07:10:00 crc 
kubenswrapper[4725]: I0227 07:10:00.161464 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="registry-server" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.161756 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="7801a2d7-1a2e-453a-964b-1cbd874ee5e4" containerName="registry-server" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.161798 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="227d2548-27eb-4d75-9e74-56d0076775b2" containerName="registry-server" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.162750 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.165568 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.165633 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.165840 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.169892 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536270-mxd4f"] Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.273699 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q4kq\" (UniqueName: \"kubernetes.io/projected/a67ece04-c2fe-4ac6-9a3b-128336bca685-kube-api-access-4q4kq\") pod \"auto-csr-approver-29536270-mxd4f\" (UID: \"a67ece04-c2fe-4ac6-9a3b-128336bca685\") " pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.375418 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q4kq\" (UniqueName: \"kubernetes.io/projected/a67ece04-c2fe-4ac6-9a3b-128336bca685-kube-api-access-4q4kq\") pod \"auto-csr-approver-29536270-mxd4f\" (UID: \"a67ece04-c2fe-4ac6-9a3b-128336bca685\") " pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.396906 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q4kq\" (UniqueName: \"kubernetes.io/projected/a67ece04-c2fe-4ac6-9a3b-128336bca685-kube-api-access-4q4kq\") pod \"auto-csr-approver-29536270-mxd4f\" (UID: \"a67ece04-c2fe-4ac6-9a3b-128336bca685\") " pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.483761 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:00 crc kubenswrapper[4725]: I0227 07:10:00.934890 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536270-mxd4f"] Feb 27 07:10:01 crc kubenswrapper[4725]: I0227 07:10:01.563191 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" event={"ID":"a67ece04-c2fe-4ac6-9a3b-128336bca685","Type":"ContainerStarted","Data":"4a461d469137bf468f92902f719ac7c8ee0d5a2d8b27ede937f3b1c3ee9d855d"} Feb 27 07:10:02 crc kubenswrapper[4725]: I0227 07:10:02.572401 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" event={"ID":"a67ece04-c2fe-4ac6-9a3b-128336bca685","Type":"ContainerStarted","Data":"3098b1a24b6bf7af9cb3dfa01637612de8f1157bb55ecb873f63aca375c82682"} Feb 27 07:10:02 crc kubenswrapper[4725]: I0227 07:10:02.591072 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" podStartSLOduration=1.451977849 
podStartE2EDuration="2.591052672s" podCreationTimestamp="2026-02-27 07:10:00 +0000 UTC" firstStartedPulling="2026-02-27 07:10:00.947733918 +0000 UTC m=+3579.410354507" lastFinishedPulling="2026-02-27 07:10:02.086808761 +0000 UTC m=+3580.549429330" observedRunningTime="2026-02-27 07:10:02.583121818 +0000 UTC m=+3581.045742387" watchObservedRunningTime="2026-02-27 07:10:02.591052672 +0000 UTC m=+3581.053673241" Feb 27 07:10:03 crc kubenswrapper[4725]: I0227 07:10:03.583124 4725 generic.go:334] "Generic (PLEG): container finished" podID="a67ece04-c2fe-4ac6-9a3b-128336bca685" containerID="3098b1a24b6bf7af9cb3dfa01637612de8f1157bb55ecb873f63aca375c82682" exitCode=0 Feb 27 07:10:03 crc kubenswrapper[4725]: I0227 07:10:03.583253 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" event={"ID":"a67ece04-c2fe-4ac6-9a3b-128336bca685","Type":"ContainerDied","Data":"3098b1a24b6bf7af9cb3dfa01637612de8f1157bb55ecb873f63aca375c82682"} Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.086924 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.175782 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q4kq\" (UniqueName: \"kubernetes.io/projected/a67ece04-c2fe-4ac6-9a3b-128336bca685-kube-api-access-4q4kq\") pod \"a67ece04-c2fe-4ac6-9a3b-128336bca685\" (UID: \"a67ece04-c2fe-4ac6-9a3b-128336bca685\") " Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.181739 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67ece04-c2fe-4ac6-9a3b-128336bca685-kube-api-access-4q4kq" (OuterVolumeSpecName: "kube-api-access-4q4kq") pod "a67ece04-c2fe-4ac6-9a3b-128336bca685" (UID: "a67ece04-c2fe-4ac6-9a3b-128336bca685"). InnerVolumeSpecName "kube-api-access-4q4kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.278207 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q4kq\" (UniqueName: \"kubernetes.io/projected/a67ece04-c2fe-4ac6-9a3b-128336bca685-kube-api-access-4q4kq\") on node \"crc\" DevicePath \"\"" Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.347529 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536264-ktxjg"] Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.362080 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536264-ktxjg"] Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.607103 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" event={"ID":"a67ece04-c2fe-4ac6-9a3b-128336bca685","Type":"ContainerDied","Data":"4a461d469137bf468f92902f719ac7c8ee0d5a2d8b27ede937f3b1c3ee9d855d"} Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.607143 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536270-mxd4f" Feb 27 07:10:05 crc kubenswrapper[4725]: I0227 07:10:05.607152 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a461d469137bf468f92902f719ac7c8ee0d5a2d8b27ede937f3b1c3ee9d855d" Feb 27 07:10:06 crc kubenswrapper[4725]: I0227 07:10:06.265050 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc3820e-2105-4ce3-be43-598af2af2beb" path="/var/lib/kubelet/pods/ccc3820e-2105-4ce3-be43-598af2af2beb/volumes" Feb 27 07:10:32 crc kubenswrapper[4725]: I0227 07:10:32.554428 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:10:32 crc kubenswrapper[4725]: I0227 07:10:32.554971 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:11:02 crc kubenswrapper[4725]: I0227 07:11:02.554931 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:11:02 crc kubenswrapper[4725]: I0227 07:11:02.555646 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 27 07:11:05 crc kubenswrapper[4725]: I0227 07:11:05.405503 4725 scope.go:117] "RemoveContainer" containerID="71571c1166ee97ec13f1122e8982ff874bc1df6a6333451ff44e51e9fe5b9480" Feb 27 07:11:32 crc kubenswrapper[4725]: I0227 07:11:32.554216 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:11:32 crc kubenswrapper[4725]: I0227 07:11:32.556178 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:11:32 crc kubenswrapper[4725]: I0227 07:11:32.556394 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:11:32 crc kubenswrapper[4725]: I0227 07:11:32.557920 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:11:32 crc kubenswrapper[4725]: I0227 07:11:32.558173 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" 
gracePeriod=600 Feb 27 07:11:32 crc kubenswrapper[4725]: E0227 07:11:32.680462 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:11:33 crc kubenswrapper[4725]: I0227 07:11:33.510391 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" exitCode=0 Feb 27 07:11:33 crc kubenswrapper[4725]: I0227 07:11:33.510447 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc"} Feb 27 07:11:33 crc kubenswrapper[4725]: I0227 07:11:33.510485 4725 scope.go:117] "RemoveContainer" containerID="e601c5ecb917d019ad7cd04e1aba76d1fc6030f46e6fcfdadfb5f33b258056ba" Feb 27 07:11:33 crc kubenswrapper[4725]: I0227 07:11:33.511245 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:11:33 crc kubenswrapper[4725]: E0227 07:11:33.511765 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:11:45 crc kubenswrapper[4725]: I0227 
07:11:45.251724 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:11:45 crc kubenswrapper[4725]: E0227 07:11:45.252734 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:11:59 crc kubenswrapper[4725]: I0227 07:11:59.251776 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:11:59 crc kubenswrapper[4725]: E0227 07:11:59.252454 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.148117 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536272-dcvj4"] Feb 27 07:12:00 crc kubenswrapper[4725]: E0227 07:12:00.149342 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67ece04-c2fe-4ac6-9a3b-128336bca685" containerName="oc" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.149484 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67ece04-c2fe-4ac6-9a3b-128336bca685" containerName="oc" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.149855 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a67ece04-c2fe-4ac6-9a3b-128336bca685" containerName="oc" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.150811 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.153573 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.153727 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.154189 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.162150 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536272-dcvj4"] Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.305815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzsss\" (UniqueName: \"kubernetes.io/projected/91e98a3d-4b50-4125-a8bd-f23c983a3606-kube-api-access-wzsss\") pod \"auto-csr-approver-29536272-dcvj4\" (UID: \"91e98a3d-4b50-4125-a8bd-f23c983a3606\") " pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.407983 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzsss\" (UniqueName: \"kubernetes.io/projected/91e98a3d-4b50-4125-a8bd-f23c983a3606-kube-api-access-wzsss\") pod \"auto-csr-approver-29536272-dcvj4\" (UID: \"91e98a3d-4b50-4125-a8bd-f23c983a3606\") " pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.435835 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzsss\" (UniqueName: 
\"kubernetes.io/projected/91e98a3d-4b50-4125-a8bd-f23c983a3606-kube-api-access-wzsss\") pod \"auto-csr-approver-29536272-dcvj4\" (UID: \"91e98a3d-4b50-4125-a8bd-f23c983a3606\") " pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.472279 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.935568 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536272-dcvj4"] Feb 27 07:12:00 crc kubenswrapper[4725]: I0227 07:12:00.942089 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:12:01 crc kubenswrapper[4725]: I0227 07:12:01.790385 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" event={"ID":"91e98a3d-4b50-4125-a8bd-f23c983a3606","Type":"ContainerStarted","Data":"cb773399e14d02c7ed71d041437ba98ce32e9505c0b183516f526698fd9c4b11"} Feb 27 07:12:03 crc kubenswrapper[4725]: I0227 07:12:03.807158 4725 generic.go:334] "Generic (PLEG): container finished" podID="91e98a3d-4b50-4125-a8bd-f23c983a3606" containerID="8c217c0ee5a3f1ebbb901aede389857cbc95e4886c6a1f05ba07f76e15cf1cd7" exitCode=0 Feb 27 07:12:03 crc kubenswrapper[4725]: I0227 07:12:03.807207 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" event={"ID":"91e98a3d-4b50-4125-a8bd-f23c983a3606","Type":"ContainerDied","Data":"8c217c0ee5a3f1ebbb901aede389857cbc95e4886c6a1f05ba07f76e15cf1cd7"} Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.230978 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.402256 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzsss\" (UniqueName: \"kubernetes.io/projected/91e98a3d-4b50-4125-a8bd-f23c983a3606-kube-api-access-wzsss\") pod \"91e98a3d-4b50-4125-a8bd-f23c983a3606\" (UID: \"91e98a3d-4b50-4125-a8bd-f23c983a3606\") " Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.409545 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e98a3d-4b50-4125-a8bd-f23c983a3606-kube-api-access-wzsss" (OuterVolumeSpecName: "kube-api-access-wzsss") pod "91e98a3d-4b50-4125-a8bd-f23c983a3606" (UID: "91e98a3d-4b50-4125-a8bd-f23c983a3606"). InnerVolumeSpecName "kube-api-access-wzsss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.505578 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzsss\" (UniqueName: \"kubernetes.io/projected/91e98a3d-4b50-4125-a8bd-f23c983a3606-kube-api-access-wzsss\") on node \"crc\" DevicePath \"\"" Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.827314 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" event={"ID":"91e98a3d-4b50-4125-a8bd-f23c983a3606","Type":"ContainerDied","Data":"cb773399e14d02c7ed71d041437ba98ce32e9505c0b183516f526698fd9c4b11"} Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.827358 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb773399e14d02c7ed71d041437ba98ce32e9505c0b183516f526698fd9c4b11" Feb 27 07:12:05 crc kubenswrapper[4725]: I0227 07:12:05.827397 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536272-dcvj4" Feb 27 07:12:06 crc kubenswrapper[4725]: I0227 07:12:06.299475 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536266-lkv6f"] Feb 27 07:12:06 crc kubenswrapper[4725]: I0227 07:12:06.312164 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536266-lkv6f"] Feb 27 07:12:08 crc kubenswrapper[4725]: I0227 07:12:08.262671 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81f47c6-3359-48e1-9dd5-68dcd62c5996" path="/var/lib/kubelet/pods/f81f47c6-3359-48e1-9dd5-68dcd62c5996/volumes" Feb 27 07:12:10 crc kubenswrapper[4725]: I0227 07:12:10.251518 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:12:10 crc kubenswrapper[4725]: E0227 07:12:10.251794 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:12:23 crc kubenswrapper[4725]: I0227 07:12:23.253179 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:12:23 crc kubenswrapper[4725]: E0227 07:12:23.254200 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:12:36 crc kubenswrapper[4725]: I0227 07:12:36.252920 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:12:36 crc kubenswrapper[4725]: E0227 07:12:36.254273 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:12:51 crc kubenswrapper[4725]: I0227 07:12:51.252740 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:12:51 crc kubenswrapper[4725]: E0227 07:12:51.254881 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:13:05 crc kubenswrapper[4725]: I0227 07:13:05.252147 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:13:05 crc kubenswrapper[4725]: E0227 07:13:05.252880 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:13:05 crc kubenswrapper[4725]: I0227 07:13:05.523637 4725 scope.go:117] "RemoveContainer" containerID="9f8b5faf75da22675f6f9a2316127469fef8980e6eff4a813f852cdc42877e73" Feb 27 07:13:18 crc kubenswrapper[4725]: I0227 07:13:18.252049 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:13:18 crc kubenswrapper[4725]: E0227 07:13:18.253696 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:13:33 crc kubenswrapper[4725]: I0227 07:13:33.252095 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:13:33 crc kubenswrapper[4725]: E0227 07:13:33.252912 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:13:46 crc kubenswrapper[4725]: I0227 07:13:46.251975 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:13:46 crc kubenswrapper[4725]: E0227 07:13:46.253664 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:13:57 crc kubenswrapper[4725]: I0227 07:13:57.251774 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:13:57 crc kubenswrapper[4725]: E0227 07:13:57.252883 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.149310 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536274-jhndd"] Feb 27 07:14:00 crc kubenswrapper[4725]: E0227 07:14:00.150521 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e98a3d-4b50-4125-a8bd-f23c983a3606" containerName="oc" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.150546 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e98a3d-4b50-4125-a8bd-f23c983a3606" containerName="oc" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.150942 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e98a3d-4b50-4125-a8bd-f23c983a3606" containerName="oc" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.151963 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.153627 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.153745 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.154028 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.160351 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536274-jhndd"] Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.263118 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8kcg\" (UniqueName: \"kubernetes.io/projected/9273eb45-5505-47f1-83d9-cdde7774cb2a-kube-api-access-f8kcg\") pod \"auto-csr-approver-29536274-jhndd\" (UID: \"9273eb45-5505-47f1-83d9-cdde7774cb2a\") " pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.365407 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8kcg\" (UniqueName: \"kubernetes.io/projected/9273eb45-5505-47f1-83d9-cdde7774cb2a-kube-api-access-f8kcg\") pod \"auto-csr-approver-29536274-jhndd\" (UID: \"9273eb45-5505-47f1-83d9-cdde7774cb2a\") " pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.384329 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8kcg\" (UniqueName: \"kubernetes.io/projected/9273eb45-5505-47f1-83d9-cdde7774cb2a-kube-api-access-f8kcg\") pod \"auto-csr-approver-29536274-jhndd\" (UID: \"9273eb45-5505-47f1-83d9-cdde7774cb2a\") " 
pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.480192 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:00 crc kubenswrapper[4725]: I0227 07:14:00.966148 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536274-jhndd"] Feb 27 07:14:01 crc kubenswrapper[4725]: I0227 07:14:01.103017 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536274-jhndd" event={"ID":"9273eb45-5505-47f1-83d9-cdde7774cb2a","Type":"ContainerStarted","Data":"7e9e1fc7f1b2408d12f34a6393696fa7a2feac6f45ab25b2636bb6f03cdc8e35"} Feb 27 07:14:03 crc kubenswrapper[4725]: I0227 07:14:03.123478 4725 generic.go:334] "Generic (PLEG): container finished" podID="9273eb45-5505-47f1-83d9-cdde7774cb2a" containerID="92eeb95ae20328609d196d9049fd05ff8ae0b5df1f7ff202a8bb0f4b84bf1704" exitCode=0 Feb 27 07:14:03 crc kubenswrapper[4725]: I0227 07:14:03.123541 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536274-jhndd" event={"ID":"9273eb45-5505-47f1-83d9-cdde7774cb2a","Type":"ContainerDied","Data":"92eeb95ae20328609d196d9049fd05ff8ae0b5df1f7ff202a8bb0f4b84bf1704"} Feb 27 07:14:04 crc kubenswrapper[4725]: I0227 07:14:04.502752 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:04 crc kubenswrapper[4725]: I0227 07:14:04.572534 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8kcg\" (UniqueName: \"kubernetes.io/projected/9273eb45-5505-47f1-83d9-cdde7774cb2a-kube-api-access-f8kcg\") pod \"9273eb45-5505-47f1-83d9-cdde7774cb2a\" (UID: \"9273eb45-5505-47f1-83d9-cdde7774cb2a\") " Feb 27 07:14:04 crc kubenswrapper[4725]: I0227 07:14:04.590838 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9273eb45-5505-47f1-83d9-cdde7774cb2a-kube-api-access-f8kcg" (OuterVolumeSpecName: "kube-api-access-f8kcg") pod "9273eb45-5505-47f1-83d9-cdde7774cb2a" (UID: "9273eb45-5505-47f1-83d9-cdde7774cb2a"). InnerVolumeSpecName "kube-api-access-f8kcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:14:04 crc kubenswrapper[4725]: I0227 07:14:04.674853 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8kcg\" (UniqueName: \"kubernetes.io/projected/9273eb45-5505-47f1-83d9-cdde7774cb2a-kube-api-access-f8kcg\") on node \"crc\" DevicePath \"\"" Feb 27 07:14:05 crc kubenswrapper[4725]: I0227 07:14:05.143067 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536274-jhndd" event={"ID":"9273eb45-5505-47f1-83d9-cdde7774cb2a","Type":"ContainerDied","Data":"7e9e1fc7f1b2408d12f34a6393696fa7a2feac6f45ab25b2636bb6f03cdc8e35"} Feb 27 07:14:05 crc kubenswrapper[4725]: I0227 07:14:05.143112 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e9e1fc7f1b2408d12f34a6393696fa7a2feac6f45ab25b2636bb6f03cdc8e35" Feb 27 07:14:05 crc kubenswrapper[4725]: I0227 07:14:05.143137 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536274-jhndd" Feb 27 07:14:05 crc kubenswrapper[4725]: I0227 07:14:05.597336 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536268-zknfx"] Feb 27 07:14:05 crc kubenswrapper[4725]: I0227 07:14:05.619816 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536268-zknfx"] Feb 27 07:14:06 crc kubenswrapper[4725]: I0227 07:14:06.262025 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76c3c4c-5b59-4736-9222-90051a209498" path="/var/lib/kubelet/pods/b76c3c4c-5b59-4736-9222-90051a209498/volumes" Feb 27 07:14:10 crc kubenswrapper[4725]: I0227 07:14:10.252630 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:14:10 crc kubenswrapper[4725]: E0227 07:14:10.253969 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:14:25 crc kubenswrapper[4725]: I0227 07:14:25.252607 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:14:25 crc kubenswrapper[4725]: E0227 07:14:25.253734 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:14:36 crc kubenswrapper[4725]: I0227 07:14:36.253052 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:14:36 crc kubenswrapper[4725]: E0227 07:14:36.254526 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:14:49 crc kubenswrapper[4725]: I0227 07:14:49.253156 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:14:49 crc kubenswrapper[4725]: E0227 07:14:49.254171 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.162749 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f"] Feb 27 07:15:00 crc kubenswrapper[4725]: E0227 07:15:00.164473 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9273eb45-5505-47f1-83d9-cdde7774cb2a" containerName="oc" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.164558 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9273eb45-5505-47f1-83d9-cdde7774cb2a" containerName="oc" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 
07:15:00.164812 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9273eb45-5505-47f1-83d9-cdde7774cb2a" containerName="oc" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.165574 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.167841 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.168002 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.176638 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f"] Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.271035 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a50807f4-a5f2-4bfe-ae16-317194fe97da-config-volume\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.271185 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zhsf\" (UniqueName: \"kubernetes.io/projected/a50807f4-a5f2-4bfe-ae16-317194fe97da-kube-api-access-7zhsf\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.271280 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a50807f4-a5f2-4bfe-ae16-317194fe97da-secret-volume\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.374023 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a50807f4-a5f2-4bfe-ae16-317194fe97da-config-volume\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.374267 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zhsf\" (UniqueName: \"kubernetes.io/projected/a50807f4-a5f2-4bfe-ae16-317194fe97da-kube-api-access-7zhsf\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.374551 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a50807f4-a5f2-4bfe-ae16-317194fe97da-secret-volume\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.375659 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a50807f4-a5f2-4bfe-ae16-317194fe97da-config-volume\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc 
kubenswrapper[4725]: I0227 07:15:00.380488 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a50807f4-a5f2-4bfe-ae16-317194fe97da-secret-volume\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.407927 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zhsf\" (UniqueName: \"kubernetes.io/projected/a50807f4-a5f2-4bfe-ae16-317194fe97da-kube-api-access-7zhsf\") pod \"collect-profiles-29536275-7sp8f\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.498561 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:00 crc kubenswrapper[4725]: I0227 07:15:00.967123 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f"] Feb 27 07:15:01 crc kubenswrapper[4725]: I0227 07:15:01.748495 4725 generic.go:334] "Generic (PLEG): container finished" podID="a50807f4-a5f2-4bfe-ae16-317194fe97da" containerID="09acf6b11d4bcab27ee07e5cd3f53fcbecc3ca1f0d8958a53ec444716c933e1e" exitCode=0 Feb 27 07:15:01 crc kubenswrapper[4725]: I0227 07:15:01.748541 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" event={"ID":"a50807f4-a5f2-4bfe-ae16-317194fe97da","Type":"ContainerDied","Data":"09acf6b11d4bcab27ee07e5cd3f53fcbecc3ca1f0d8958a53ec444716c933e1e"} Feb 27 07:15:01 crc kubenswrapper[4725]: I0227 07:15:01.748787 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" event={"ID":"a50807f4-a5f2-4bfe-ae16-317194fe97da","Type":"ContainerStarted","Data":"8bc21fd427aba8ac0d6a896c45bc16fb3899ae8a954fa8c08f52da9565feb5a9"} Feb 27 07:15:02 crc kubenswrapper[4725]: I0227 07:15:02.266266 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:15:02 crc kubenswrapper[4725]: E0227 07:15:02.266876 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.171227 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.239921 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a50807f4-a5f2-4bfe-ae16-317194fe97da-secret-volume\") pod \"a50807f4-a5f2-4bfe-ae16-317194fe97da\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.240040 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zhsf\" (UniqueName: \"kubernetes.io/projected/a50807f4-a5f2-4bfe-ae16-317194fe97da-kube-api-access-7zhsf\") pod \"a50807f4-a5f2-4bfe-ae16-317194fe97da\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.240186 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a50807f4-a5f2-4bfe-ae16-317194fe97da-config-volume\") pod \"a50807f4-a5f2-4bfe-ae16-317194fe97da\" (UID: \"a50807f4-a5f2-4bfe-ae16-317194fe97da\") " Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.240724 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a50807f4-a5f2-4bfe-ae16-317194fe97da-config-volume" (OuterVolumeSpecName: "config-volume") pod "a50807f4-a5f2-4bfe-ae16-317194fe97da" (UID: "a50807f4-a5f2-4bfe-ae16-317194fe97da"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.241022 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a50807f4-a5f2-4bfe-ae16-317194fe97da-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.246224 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a50807f4-a5f2-4bfe-ae16-317194fe97da-kube-api-access-7zhsf" (OuterVolumeSpecName: "kube-api-access-7zhsf") pod "a50807f4-a5f2-4bfe-ae16-317194fe97da" (UID: "a50807f4-a5f2-4bfe-ae16-317194fe97da"). InnerVolumeSpecName "kube-api-access-7zhsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.247559 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a50807f4-a5f2-4bfe-ae16-317194fe97da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a50807f4-a5f2-4bfe-ae16-317194fe97da" (UID: "a50807f4-a5f2-4bfe-ae16-317194fe97da"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.344078 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zhsf\" (UniqueName: \"kubernetes.io/projected/a50807f4-a5f2-4bfe-ae16-317194fe97da-kube-api-access-7zhsf\") on node \"crc\" DevicePath \"\"" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.344147 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a50807f4-a5f2-4bfe-ae16-317194fe97da-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.768615 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" event={"ID":"a50807f4-a5f2-4bfe-ae16-317194fe97da","Type":"ContainerDied","Data":"8bc21fd427aba8ac0d6a896c45bc16fb3899ae8a954fa8c08f52da9565feb5a9"} Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.768652 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bc21fd427aba8ac0d6a896c45bc16fb3899ae8a954fa8c08f52da9565feb5a9" Feb 27 07:15:03 crc kubenswrapper[4725]: I0227 07:15:03.768700 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f" Feb 27 07:15:04 crc kubenswrapper[4725]: I0227 07:15:04.249762 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m"] Feb 27 07:15:04 crc kubenswrapper[4725]: I0227 07:15:04.264113 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536230-9h85m"] Feb 27 07:15:05 crc kubenswrapper[4725]: I0227 07:15:05.678966 4725 scope.go:117] "RemoveContainer" containerID="288d3616adda00da859bc2eb52f1be733f94f7fdec6223728c6bddbcf10f2db4" Feb 27 07:15:06 crc kubenswrapper[4725]: I0227 07:15:06.263541 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd6e930-ab13-4ead-9173-1ecc6c561944" path="/var/lib/kubelet/pods/1cd6e930-ab13-4ead-9173-1ecc6c561944/volumes" Feb 27 07:15:16 crc kubenswrapper[4725]: I0227 07:15:16.251836 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:15:16 crc kubenswrapper[4725]: E0227 07:15:16.252699 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:15:27 crc kubenswrapper[4725]: I0227 07:15:27.251699 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:15:27 crc kubenswrapper[4725]: E0227 07:15:27.253804 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:15:42 crc kubenswrapper[4725]: I0227 07:15:42.260179 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:15:42 crc kubenswrapper[4725]: E0227 07:15:42.261243 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:15:54 crc kubenswrapper[4725]: I0227 07:15:54.251438 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:15:54 crc kubenswrapper[4725]: E0227 07:15:54.252034 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.174901 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536276-w95m8"] Feb 27 07:16:00 crc kubenswrapper[4725]: E0227 07:16:00.175909 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50807f4-a5f2-4bfe-ae16-317194fe97da" containerName="collect-profiles" Feb 27 
07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.175921 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50807f4-a5f2-4bfe-ae16-317194fe97da" containerName="collect-profiles" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.176142 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50807f4-a5f2-4bfe-ae16-317194fe97da" containerName="collect-profiles" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.177070 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.179881 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.180108 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.180332 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.192588 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536276-w95m8"] Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.325001 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hmgv\" (UniqueName: \"kubernetes.io/projected/dff88595-4388-4081-85f1-51a28a80e5d3-kube-api-access-5hmgv\") pod \"auto-csr-approver-29536276-w95m8\" (UID: \"dff88595-4388-4081-85f1-51a28a80e5d3\") " pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.428206 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hmgv\" (UniqueName: \"kubernetes.io/projected/dff88595-4388-4081-85f1-51a28a80e5d3-kube-api-access-5hmgv\") pod 
\"auto-csr-approver-29536276-w95m8\" (UID: \"dff88595-4388-4081-85f1-51a28a80e5d3\") " pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.459244 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hmgv\" (UniqueName: \"kubernetes.io/projected/dff88595-4388-4081-85f1-51a28a80e5d3-kube-api-access-5hmgv\") pod \"auto-csr-approver-29536276-w95m8\" (UID: \"dff88595-4388-4081-85f1-51a28a80e5d3\") " pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.500574 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:00 crc kubenswrapper[4725]: I0227 07:16:00.988322 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536276-w95m8"] Feb 27 07:16:01 crc kubenswrapper[4725]: I0227 07:16:01.374179 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536276-w95m8" event={"ID":"dff88595-4388-4081-85f1-51a28a80e5d3","Type":"ContainerStarted","Data":"537ff1ba66074f3a9981ac0e91e7de793fc21f5ffddc3021af92623e75445344"} Feb 27 07:16:02 crc kubenswrapper[4725]: I0227 07:16:02.398474 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536276-w95m8" event={"ID":"dff88595-4388-4081-85f1-51a28a80e5d3","Type":"ContainerStarted","Data":"6c06564b8af63d71533db047b5d78ca3781a843280b05178db6c0abe6733cb9f"} Feb 27 07:16:02 crc kubenswrapper[4725]: I0227 07:16:02.445891 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536276-w95m8" podStartSLOduration=1.588940486 podStartE2EDuration="2.445862575s" podCreationTimestamp="2026-02-27 07:16:00 +0000 UTC" firstStartedPulling="2026-02-27 07:16:00.998734406 +0000 UTC m=+3939.461354975" lastFinishedPulling="2026-02-27 
07:16:01.855656495 +0000 UTC m=+3940.318277064" observedRunningTime="2026-02-27 07:16:02.412954865 +0000 UTC m=+3940.875575454" watchObservedRunningTime="2026-02-27 07:16:02.445862575 +0000 UTC m=+3940.908483144" Feb 27 07:16:03 crc kubenswrapper[4725]: I0227 07:16:03.410645 4725 generic.go:334] "Generic (PLEG): container finished" podID="dff88595-4388-4081-85f1-51a28a80e5d3" containerID="6c06564b8af63d71533db047b5d78ca3781a843280b05178db6c0abe6733cb9f" exitCode=0 Feb 27 07:16:03 crc kubenswrapper[4725]: I0227 07:16:03.410747 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536276-w95m8" event={"ID":"dff88595-4388-4081-85f1-51a28a80e5d3","Type":"ContainerDied","Data":"6c06564b8af63d71533db047b5d78ca3781a843280b05178db6c0abe6733cb9f"} Feb 27 07:16:04 crc kubenswrapper[4725]: I0227 07:16:04.875017 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:04 crc kubenswrapper[4725]: I0227 07:16:04.950549 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hmgv\" (UniqueName: \"kubernetes.io/projected/dff88595-4388-4081-85f1-51a28a80e5d3-kube-api-access-5hmgv\") pod \"dff88595-4388-4081-85f1-51a28a80e5d3\" (UID: \"dff88595-4388-4081-85f1-51a28a80e5d3\") " Feb 27 07:16:04 crc kubenswrapper[4725]: I0227 07:16:04.967619 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff88595-4388-4081-85f1-51a28a80e5d3-kube-api-access-5hmgv" (OuterVolumeSpecName: "kube-api-access-5hmgv") pod "dff88595-4388-4081-85f1-51a28a80e5d3" (UID: "dff88595-4388-4081-85f1-51a28a80e5d3"). InnerVolumeSpecName "kube-api-access-5hmgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.053226 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hmgv\" (UniqueName: \"kubernetes.io/projected/dff88595-4388-4081-85f1-51a28a80e5d3-kube-api-access-5hmgv\") on node \"crc\" DevicePath \"\"" Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.375231 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536270-mxd4f"] Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.385334 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536270-mxd4f"] Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.430568 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536276-w95m8" event={"ID":"dff88595-4388-4081-85f1-51a28a80e5d3","Type":"ContainerDied","Data":"537ff1ba66074f3a9981ac0e91e7de793fc21f5ffddc3021af92623e75445344"} Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.430613 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="537ff1ba66074f3a9981ac0e91e7de793fc21f5ffddc3021af92623e75445344" Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.430662 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536276-w95m8" Feb 27 07:16:05 crc kubenswrapper[4725]: I0227 07:16:05.771398 4725 scope.go:117] "RemoveContainer" containerID="83b4e247ca46f7289ff1df656689e037beda39fe69ad57127cbfe819ceea5984" Feb 27 07:16:06 crc kubenswrapper[4725]: I0227 07:16:06.265774 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67ece04-c2fe-4ac6-9a3b-128336bca685" path="/var/lib/kubelet/pods/a67ece04-c2fe-4ac6-9a3b-128336bca685/volumes" Feb 27 07:16:09 crc kubenswrapper[4725]: I0227 07:16:09.251912 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:16:09 crc kubenswrapper[4725]: E0227 07:16:09.252514 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:16:23 crc kubenswrapper[4725]: I0227 07:16:23.251695 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:16:23 crc kubenswrapper[4725]: E0227 07:16:23.253093 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:16:35 crc kubenswrapper[4725]: I0227 07:16:35.252996 4725 scope.go:117] "RemoveContainer" 
containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc" Feb 27 07:16:35 crc kubenswrapper[4725]: I0227 07:16:35.763204 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"6208386aa9edfdd355d3967a7f5b53c627800b1b18b9a58493de299b226ea063"} Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.006659 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s795h"] Feb 27 07:16:43 crc kubenswrapper[4725]: E0227 07:16:43.008001 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff88595-4388-4081-85f1-51a28a80e5d3" containerName="oc" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.008022 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff88595-4388-4081-85f1-51a28a80e5d3" containerName="oc" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.008386 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff88595-4388-4081-85f1-51a28a80e5d3" containerName="oc" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.011757 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.031362 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s795h"] Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.060797 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-catalog-content\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.060846 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqg9\" (UniqueName: \"kubernetes.io/projected/dd391ba4-f3a3-46ae-911d-2fdbe084e782-kube-api-access-9xqg9\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.060932 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-utilities\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.162232 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-catalog-content\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.162322 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9xqg9\" (UniqueName: \"kubernetes.io/projected/dd391ba4-f3a3-46ae-911d-2fdbe084e782-kube-api-access-9xqg9\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.162460 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-utilities\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.162778 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-catalog-content\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.162896 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-utilities\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.191376 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqg9\" (UniqueName: \"kubernetes.io/projected/dd391ba4-f3a3-46ae-911d-2fdbe084e782-kube-api-access-9xqg9\") pod \"redhat-marketplace-s795h\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.374460 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:43 crc kubenswrapper[4725]: I0227 07:16:43.913980 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s795h"] Feb 27 07:16:44 crc kubenswrapper[4725]: I0227 07:16:44.914850 4725 generic.go:334] "Generic (PLEG): container finished" podID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerID="100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3" exitCode=0 Feb 27 07:16:44 crc kubenswrapper[4725]: I0227 07:16:44.914917 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerDied","Data":"100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3"} Feb 27 07:16:44 crc kubenswrapper[4725]: I0227 07:16:44.915363 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerStarted","Data":"41c0a5dd5204af559e222b1f7ed8bdf9f23615d64e39c81617cb3057d9c9e6ac"} Feb 27 07:16:45 crc kubenswrapper[4725]: I0227 07:16:45.927979 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerStarted","Data":"949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b"} Feb 27 07:16:46 crc kubenswrapper[4725]: I0227 07:16:46.937947 4725 generic.go:334] "Generic (PLEG): container finished" podID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerID="949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b" exitCode=0 Feb 27 07:16:46 crc kubenswrapper[4725]: I0227 07:16:46.938255 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" 
event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerDied","Data":"949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b"} Feb 27 07:16:47 crc kubenswrapper[4725]: I0227 07:16:47.951436 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerStarted","Data":"befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed"} Feb 27 07:16:47 crc kubenswrapper[4725]: I0227 07:16:47.978085 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s795h" podStartSLOduration=3.565152856 podStartE2EDuration="5.978062834s" podCreationTimestamp="2026-02-27 07:16:42 +0000 UTC" firstStartedPulling="2026-02-27 07:16:44.919518688 +0000 UTC m=+3983.382139257" lastFinishedPulling="2026-02-27 07:16:47.332428646 +0000 UTC m=+3985.795049235" observedRunningTime="2026-02-27 07:16:47.976748047 +0000 UTC m=+3986.439368616" watchObservedRunningTime="2026-02-27 07:16:47.978062834 +0000 UTC m=+3986.440683403" Feb 27 07:16:53 crc kubenswrapper[4725]: I0227 07:16:53.374584 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:53 crc kubenswrapper[4725]: I0227 07:16:53.375256 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:53 crc kubenswrapper[4725]: I0227 07:16:53.438240 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:54 crc kubenswrapper[4725]: I0227 07:16:54.077989 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:54 crc kubenswrapper[4725]: I0227 07:16:54.129753 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-s795h"] Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.053801 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s795h" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="registry-server" containerID="cri-o://befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed" gracePeriod=2 Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.607324 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.708017 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-utilities\") pod \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.708180 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xqg9\" (UniqueName: \"kubernetes.io/projected/dd391ba4-f3a3-46ae-911d-2fdbe084e782-kube-api-access-9xqg9\") pod \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.708372 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-catalog-content\") pod \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\" (UID: \"dd391ba4-f3a3-46ae-911d-2fdbe084e782\") " Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.708969 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-utilities" (OuterVolumeSpecName: "utilities") pod "dd391ba4-f3a3-46ae-911d-2fdbe084e782" (UID: 
"dd391ba4-f3a3-46ae-911d-2fdbe084e782"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.715995 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd391ba4-f3a3-46ae-911d-2fdbe084e782-kube-api-access-9xqg9" (OuterVolumeSpecName: "kube-api-access-9xqg9") pod "dd391ba4-f3a3-46ae-911d-2fdbe084e782" (UID: "dd391ba4-f3a3-46ae-911d-2fdbe084e782"). InnerVolumeSpecName "kube-api-access-9xqg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.733147 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd391ba4-f3a3-46ae-911d-2fdbe084e782" (UID: "dd391ba4-f3a3-46ae-911d-2fdbe084e782"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.811191 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.811226 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xqg9\" (UniqueName: \"kubernetes.io/projected/dd391ba4-f3a3-46ae-911d-2fdbe084e782-kube-api-access-9xqg9\") on node \"crc\" DevicePath \"\"" Feb 27 07:16:56 crc kubenswrapper[4725]: I0227 07:16:56.811237 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd391ba4-f3a3-46ae-911d-2fdbe084e782-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.068438 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerID="befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed" exitCode=0 Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.068539 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s795h" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.068560 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerDied","Data":"befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed"} Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.069020 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s795h" event={"ID":"dd391ba4-f3a3-46ae-911d-2fdbe084e782","Type":"ContainerDied","Data":"41c0a5dd5204af559e222b1f7ed8bdf9f23615d64e39c81617cb3057d9c9e6ac"} Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.069051 4725 scope.go:117] "RemoveContainer" containerID="befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.097540 4725 scope.go:117] "RemoveContainer" containerID="949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.130719 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s795h"] Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.142664 4725 scope.go:117] "RemoveContainer" containerID="100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.143258 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s795h"] Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.195494 4725 scope.go:117] "RemoveContainer" 
containerID="befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed" Feb 27 07:16:57 crc kubenswrapper[4725]: E0227 07:16:57.196068 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed\": container with ID starting with befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed not found: ID does not exist" containerID="befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.196247 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed"} err="failed to get container status \"befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed\": rpc error: code = NotFound desc = could not find container \"befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed\": container with ID starting with befdf84cd0f7f1bb6a0c765175958dd536a3676e040003b1b9970ed0ab6a29ed not found: ID does not exist" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.196416 4725 scope.go:117] "RemoveContainer" containerID="949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b" Feb 27 07:16:57 crc kubenswrapper[4725]: E0227 07:16:57.197058 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b\": container with ID starting with 949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b not found: ID does not exist" containerID="949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.197182 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b"} err="failed to get container status \"949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b\": rpc error: code = NotFound desc = could not find container \"949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b\": container with ID starting with 949e8c26ea1715a3536f1f2337963b036ea7c47ff461280b6d1c901ec719a83b not found: ID does not exist" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.197307 4725 scope.go:117] "RemoveContainer" containerID="100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3" Feb 27 07:16:57 crc kubenswrapper[4725]: E0227 07:16:57.197677 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3\": container with ID starting with 100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3 not found: ID does not exist" containerID="100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3" Feb 27 07:16:57 crc kubenswrapper[4725]: I0227 07:16:57.197782 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3"} err="failed to get container status \"100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3\": rpc error: code = NotFound desc = could not find container \"100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3\": container with ID starting with 100c6aa97cf22ab75f957e54c96a33c3f406870e732432ae9dfb7227bb1191a3 not found: ID does not exist" Feb 27 07:16:58 crc kubenswrapper[4725]: I0227 07:16:58.271719 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" path="/var/lib/kubelet/pods/dd391ba4-f3a3-46ae-911d-2fdbe084e782/volumes" Feb 27 07:17:06 crc kubenswrapper[4725]: I0227 
07:17:06.239038 4725 scope.go:117] "RemoveContainer" containerID="3098b1a24b6bf7af9cb3dfa01637612de8f1157bb55ecb873f63aca375c82682" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.920706 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f9gx7"] Feb 27 07:17:55 crc kubenswrapper[4725]: E0227 07:17:55.921770 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="extract-utilities" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.921787 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="extract-utilities" Feb 27 07:17:55 crc kubenswrapper[4725]: E0227 07:17:55.921831 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="registry-server" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.921839 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="registry-server" Feb 27 07:17:55 crc kubenswrapper[4725]: E0227 07:17:55.921860 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="extract-content" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.921868 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="extract-content" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.922085 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd391ba4-f3a3-46ae-911d-2fdbe084e782" containerName="registry-server" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.924070 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.941955 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9gx7"] Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.967799 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-catalog-content\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.967910 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp9vv\" (UniqueName: \"kubernetes.io/projected/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-kube-api-access-xp9vv\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:55 crc kubenswrapper[4725]: I0227 07:17:55.968011 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-utilities\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.071005 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp9vv\" (UniqueName: \"kubernetes.io/projected/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-kube-api-access-xp9vv\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.071160 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-utilities\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.071622 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-catalog-content\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.071797 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-utilities\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.072079 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-catalog-content\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.099119 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp9vv\" (UniqueName: \"kubernetes.io/projected/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-kube-api-access-xp9vv\") pod \"redhat-operators-f9gx7\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") " pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.256326 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f9gx7" Feb 27 07:17:56 crc kubenswrapper[4725]: I0227 07:17:56.794433 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9gx7"] Feb 27 07:17:57 crc kubenswrapper[4725]: I0227 07:17:57.756574 4725 generic.go:334] "Generic (PLEG): container finished" podID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerID="34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4" exitCode=0 Feb 27 07:17:57 crc kubenswrapper[4725]: I0227 07:17:57.756693 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerDied","Data":"34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4"} Feb 27 07:17:57 crc kubenswrapper[4725]: I0227 07:17:57.756964 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerStarted","Data":"ade90d6cb3256f6fe43690689e98e0d7ceec3564926ed7c305636dc239a60a70"} Feb 27 07:17:57 crc kubenswrapper[4725]: I0227 07:17:57.758694 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:17:58 crc kubenswrapper[4725]: I0227 07:17:58.780241 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerStarted","Data":"7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6"} Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.178181 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536278-6z9wd"] Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.180950 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.186738 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.187865 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.188148 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.201981 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536278-6z9wd"] Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.296082 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8sgc\" (UniqueName: \"kubernetes.io/projected/eb2c8da1-5c71-40c5-92e6-b92b426977f0-kube-api-access-q8sgc\") pod \"auto-csr-approver-29536278-6z9wd\" (UID: \"eb2c8da1-5c71-40c5-92e6-b92b426977f0\") " pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.399222 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8sgc\" (UniqueName: \"kubernetes.io/projected/eb2c8da1-5c71-40c5-92e6-b92b426977f0-kube-api-access-q8sgc\") pod \"auto-csr-approver-29536278-6z9wd\" (UID: \"eb2c8da1-5c71-40c5-92e6-b92b426977f0\") " pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.422051 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8sgc\" (UniqueName: \"kubernetes.io/projected/eb2c8da1-5c71-40c5-92e6-b92b426977f0-kube-api-access-q8sgc\") pod \"auto-csr-approver-29536278-6z9wd\" (UID: \"eb2c8da1-5c71-40c5-92e6-b92b426977f0\") " 
pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:00 crc kubenswrapper[4725]: I0227 07:18:00.516044 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:01 crc kubenswrapper[4725]: I0227 07:18:01.054788 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536278-6z9wd"] Feb 27 07:18:01 crc kubenswrapper[4725]: W0227 07:18:01.062929 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb2c8da1_5c71_40c5_92e6_b92b426977f0.slice/crio-74ad5420560125594d662fae8809b509875f6e5a6a4a0c0bb79644545c364c55 WatchSource:0}: Error finding container 74ad5420560125594d662fae8809b509875f6e5a6a4a0c0bb79644545c364c55: Status 404 returned error can't find the container with id 74ad5420560125594d662fae8809b509875f6e5a6a4a0c0bb79644545c364c55 Feb 27 07:18:01 crc kubenswrapper[4725]: I0227 07:18:01.813857 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" event={"ID":"eb2c8da1-5c71-40c5-92e6-b92b426977f0","Type":"ContainerStarted","Data":"74ad5420560125594d662fae8809b509875f6e5a6a4a0c0bb79644545c364c55"} Feb 27 07:18:02 crc kubenswrapper[4725]: I0227 07:18:02.824925 4725 generic.go:334] "Generic (PLEG): container finished" podID="eb2c8da1-5c71-40c5-92e6-b92b426977f0" containerID="bcb25cc17ff68025716241fd3c69e6d502681033acf6219313ff44df0d5fe04c" exitCode=0 Feb 27 07:18:02 crc kubenswrapper[4725]: I0227 07:18:02.825031 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" event={"ID":"eb2c8da1-5c71-40c5-92e6-b92b426977f0","Type":"ContainerDied","Data":"bcb25cc17ff68025716241fd3c69e6d502681033acf6219313ff44df0d5fe04c"} Feb 27 07:18:03 crc kubenswrapper[4725]: I0227 07:18:03.841992 4725 generic.go:334] "Generic (PLEG): container finished" 
podID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerID="7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6" exitCode=0 Feb 27 07:18:03 crc kubenswrapper[4725]: I0227 07:18:03.842049 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerDied","Data":"7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6"} Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.273896 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.393640 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8sgc\" (UniqueName: \"kubernetes.io/projected/eb2c8da1-5c71-40c5-92e6-b92b426977f0-kube-api-access-q8sgc\") pod \"eb2c8da1-5c71-40c5-92e6-b92b426977f0\" (UID: \"eb2c8da1-5c71-40c5-92e6-b92b426977f0\") " Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.400985 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2c8da1-5c71-40c5-92e6-b92b426977f0-kube-api-access-q8sgc" (OuterVolumeSpecName: "kube-api-access-q8sgc") pod "eb2c8da1-5c71-40c5-92e6-b92b426977f0" (UID: "eb2c8da1-5c71-40c5-92e6-b92b426977f0"). InnerVolumeSpecName "kube-api-access-q8sgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.497765 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8sgc\" (UniqueName: \"kubernetes.io/projected/eb2c8da1-5c71-40c5-92e6-b92b426977f0-kube-api-access-q8sgc\") on node \"crc\" DevicePath \"\"" Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.854321 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.854309 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536278-6z9wd" event={"ID":"eb2c8da1-5c71-40c5-92e6-b92b426977f0","Type":"ContainerDied","Data":"74ad5420560125594d662fae8809b509875f6e5a6a4a0c0bb79644545c364c55"} Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.854507 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ad5420560125594d662fae8809b509875f6e5a6a4a0c0bb79644545c364c55" Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.857176 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerStarted","Data":"7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255"} Feb 27 07:18:04 crc kubenswrapper[4725]: I0227 07:18:04.884463 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f9gx7" podStartSLOduration=3.416244328 podStartE2EDuration="9.884444497s" podCreationTimestamp="2026-02-27 07:17:55 +0000 UTC" firstStartedPulling="2026-02-27 07:17:57.758511407 +0000 UTC m=+4056.221131976" lastFinishedPulling="2026-02-27 07:18:04.226711576 +0000 UTC m=+4062.689332145" observedRunningTime="2026-02-27 07:18:04.87643739 +0000 UTC m=+4063.339057989" watchObservedRunningTime="2026-02-27 07:18:04.884444497 +0000 UTC m=+4063.347065066" Feb 27 07:18:05 crc kubenswrapper[4725]: I0227 07:18:05.363226 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536272-dcvj4"] Feb 27 07:18:05 crc kubenswrapper[4725]: I0227 07:18:05.375749 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536272-dcvj4"] Feb 27 07:18:06 crc kubenswrapper[4725]: I0227 07:18:06.273619 4725 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e98a3d-4b50-4125-a8bd-f23c983a3606" path="/var/lib/kubelet/pods/91e98a3d-4b50-4125-a8bd-f23c983a3606/volumes"
Feb 27 07:18:06 crc kubenswrapper[4725]: I0227 07:18:06.275051 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f9gx7"
Feb 27 07:18:06 crc kubenswrapper[4725]: I0227 07:18:06.275115 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f9gx7"
Feb 27 07:18:06 crc kubenswrapper[4725]: I0227 07:18:06.342606 4725 scope.go:117] "RemoveContainer" containerID="8c217c0ee5a3f1ebbb901aede389857cbc95e4886c6a1f05ba07f76e15cf1cd7"
Feb 27 07:18:07 crc kubenswrapper[4725]: I0227 07:18:07.349529 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f9gx7" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="registry-server" probeResult="failure" output=<
Feb 27 07:18:07 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s
Feb 27 07:18:07 crc kubenswrapper[4725]: >
Feb 27 07:18:16 crc kubenswrapper[4725]: I0227 07:18:16.319559 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f9gx7"
Feb 27 07:18:16 crc kubenswrapper[4725]: I0227 07:18:16.404938 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f9gx7"
Feb 27 07:18:16 crc kubenswrapper[4725]: I0227 07:18:16.564014 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9gx7"]
Feb 27 07:18:17 crc kubenswrapper[4725]: I0227 07:18:17.985251 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f9gx7" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="registry-server" containerID="cri-o://7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255" gracePeriod=2
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.503943 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9gx7"
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.615326 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-utilities\") pod \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") "
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.615626 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-catalog-content\") pod \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") "
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.615781 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp9vv\" (UniqueName: \"kubernetes.io/projected/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-kube-api-access-xp9vv\") pod \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\" (UID: \"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb\") "
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.616483 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-utilities" (OuterVolumeSpecName: "utilities") pod "64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" (UID: "64d673ce-34b7-48c7-878b-fc4d0c4cb8eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.616749 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.632393 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-kube-api-access-xp9vv" (OuterVolumeSpecName: "kube-api-access-xp9vv") pod "64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" (UID: "64d673ce-34b7-48c7-878b-fc4d0c4cb8eb"). InnerVolumeSpecName "kube-api-access-xp9vv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.718470 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp9vv\" (UniqueName: \"kubernetes.io/projected/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-kube-api-access-xp9vv\") on node \"crc\" DevicePath \"\""
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.748252 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" (UID: "64d673ce-34b7-48c7-878b-fc4d0c4cb8eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.820637 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.998536 4725 generic.go:334] "Generic (PLEG): container finished" podID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerID="7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255" exitCode=0
Feb 27 07:18:18 crc kubenswrapper[4725]: I0227 07:18:18.998605 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerDied","Data":"7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255"}
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:18.998630 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9gx7"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:18.998648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9gx7" event={"ID":"64d673ce-34b7-48c7-878b-fc4d0c4cb8eb","Type":"ContainerDied","Data":"ade90d6cb3256f6fe43690689e98e0d7ceec3564926ed7c305636dc239a60a70"}
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:18.998680 4725 scope.go:117] "RemoveContainer" containerID="7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.042215 4725 scope.go:117] "RemoveContainer" containerID="7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.042755 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9gx7"]
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.051222 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f9gx7"]
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.072906 4725 scope.go:117] "RemoveContainer" containerID="34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.153146 4725 scope.go:117] "RemoveContainer" containerID="7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255"
Feb 27 07:18:19 crc kubenswrapper[4725]: E0227 07:18:19.154191 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255\": container with ID starting with 7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255 not found: ID does not exist" containerID="7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.154233 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255"} err="failed to get container status \"7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255\": rpc error: code = NotFound desc = could not find container \"7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255\": container with ID starting with 7afee2f42fbb68ccdc5b56a9336ae06f456a5599bef5d1ae7bbcbcaa610e0255 not found: ID does not exist"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.154258 4725 scope.go:117] "RemoveContainer" containerID="7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6"
Feb 27 07:18:19 crc kubenswrapper[4725]: E0227 07:18:19.154683 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6\": container with ID starting with 7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6 not found: ID does not exist" containerID="7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.154755 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6"} err="failed to get container status \"7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6\": rpc error: code = NotFound desc = could not find container \"7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6\": container with ID starting with 7198f65c1fba1e81c91008cfae6e37d1a0ce8876333bfc6c79cb290541ca3ae6 not found: ID does not exist"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.154791 4725 scope.go:117] "RemoveContainer" containerID="34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4"
Feb 27 07:18:19 crc kubenswrapper[4725]: E0227 07:18:19.155148 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4\": container with ID starting with 34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4 not found: ID does not exist" containerID="34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4"
Feb 27 07:18:19 crc kubenswrapper[4725]: I0227 07:18:19.155240 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4"} err="failed to get container status \"34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4\": rpc error: code = NotFound desc = could not find container \"34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4\": container with ID starting with 34d7165e381d14c9930b5e341dd7ae9fe75e18c675229908822ef78e398e2be4 not found: ID does not exist"
Feb 27 07:18:20 crc kubenswrapper[4725]: I0227 07:18:20.272253 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" path="/var/lib/kubelet/pods/64d673ce-34b7-48c7-878b-fc4d0c4cb8eb/volumes"
Feb 27 07:19:02 crc kubenswrapper[4725]: I0227 07:19:02.588227 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 07:19:02 crc kubenswrapper[4725]: I0227 07:19:02.588771 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 07:19:32 crc kubenswrapper[4725]: I0227 07:19:32.554907 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 07:19:32 crc kubenswrapper[4725]: I0227 07:19:32.555532 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.906742 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cdmlm"]
Feb 27 07:19:50 crc kubenswrapper[4725]: E0227 07:19:50.907649 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="extract-content"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.907664 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="extract-content"
Feb 27 07:19:50 crc kubenswrapper[4725]: E0227 07:19:50.907678 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="extract-utilities"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.907684 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="extract-utilities"
Feb 27 07:19:50 crc kubenswrapper[4725]: E0227 07:19:50.907697 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="registry-server"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.907703 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="registry-server"
Feb 27 07:19:50 crc kubenswrapper[4725]: E0227 07:19:50.907730 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2c8da1-5c71-40c5-92e6-b92b426977f0" containerName="oc"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.907736 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2c8da1-5c71-40c5-92e6-b92b426977f0" containerName="oc"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.907924 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2c8da1-5c71-40c5-92e6-b92b426977f0" containerName="oc"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.907943 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d673ce-34b7-48c7-878b-fc4d0c4cb8eb" containerName="registry-server"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.909459 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.920132 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdmlm"]
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.958810 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48g6d\" (UniqueName: \"kubernetes.io/projected/40560461-f37c-4568-9c87-3c2578a66f6d-kube-api-access-48g6d\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.959148 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-catalog-content\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:50 crc kubenswrapper[4725]: I0227 07:19:50.959486 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-utilities\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.061671 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-catalog-content\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.062128 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-utilities\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.062276 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-catalog-content\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.062303 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48g6d\" (UniqueName: \"kubernetes.io/projected/40560461-f37c-4568-9c87-3c2578a66f6d-kube-api-access-48g6d\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.062486 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-utilities\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.092357 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48g6d\" (UniqueName: \"kubernetes.io/projected/40560461-f37c-4568-9c87-3c2578a66f6d-kube-api-access-48g6d\") pod \"community-operators-cdmlm\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") " pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.231233 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:19:51 crc kubenswrapper[4725]: I0227 07:19:51.799566 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdmlm"]
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.024648 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerStarted","Data":"32f7a37c211559ed69602527ce3402b3a4003842f18dc4439563f644a66cb97d"}
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.694497 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sl74x"]
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.698680 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.719334 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sl74x"]
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.895854 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d9ca16-8811-47ec-9f57-f3e2e41620be-catalog-content\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.895911 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d9ca16-8811-47ec-9f57-f3e2e41620be-utilities\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:52 crc kubenswrapper[4725]: I0227 07:19:52.896301 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67km\" (UniqueName: \"kubernetes.io/projected/88d9ca16-8811-47ec-9f57-f3e2e41620be-kube-api-access-b67km\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.000160 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d9ca16-8811-47ec-9f57-f3e2e41620be-catalog-content\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.000268 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d9ca16-8811-47ec-9f57-f3e2e41620be-utilities\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.000410 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b67km\" (UniqueName: \"kubernetes.io/projected/88d9ca16-8811-47ec-9f57-f3e2e41620be-kube-api-access-b67km\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.000755 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d9ca16-8811-47ec-9f57-f3e2e41620be-catalog-content\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.001047 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d9ca16-8811-47ec-9f57-f3e2e41620be-utilities\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.041762 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67km\" (UniqueName: \"kubernetes.io/projected/88d9ca16-8811-47ec-9f57-f3e2e41620be-kube-api-access-b67km\") pod \"certified-operators-sl74x\" (UID: \"88d9ca16-8811-47ec-9f57-f3e2e41620be\") " pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.050012 4725 generic.go:334] "Generic (PLEG): container finished" podID="40560461-f37c-4568-9c87-3c2578a66f6d" containerID="5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb" exitCode=0
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.050080 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerDied","Data":"5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb"}
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.329502 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sl74x"
Feb 27 07:19:53 crc kubenswrapper[4725]: I0227 07:19:53.819646 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sl74x"]
Feb 27 07:19:54 crc kubenswrapper[4725]: I0227 07:19:54.073272 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl74x" event={"ID":"88d9ca16-8811-47ec-9f57-f3e2e41620be","Type":"ContainerStarted","Data":"e70f7ac4e98a854c8ca945bfcedcd3c7cc28cb3cff475f3b3597aa6856d3df1b"}
Feb 27 07:19:55 crc kubenswrapper[4725]: I0227 07:19:55.087959 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerStarted","Data":"18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1"}
Feb 27 07:19:55 crc kubenswrapper[4725]: I0227 07:19:55.091271 4725 generic.go:334] "Generic (PLEG): container finished" podID="88d9ca16-8811-47ec-9f57-f3e2e41620be" containerID="1d4dbde6a8be1f6d511833cd45f57d200cc9150dc9d052ea162f77d2729b5d67" exitCode=0
Feb 27 07:19:55 crc kubenswrapper[4725]: I0227 07:19:55.091329 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl74x" event={"ID":"88d9ca16-8811-47ec-9f57-f3e2e41620be","Type":"ContainerDied","Data":"1d4dbde6a8be1f6d511833cd45f57d200cc9150dc9d052ea162f77d2729b5d67"}
Feb 27 07:19:56 crc kubenswrapper[4725]: I0227 07:19:56.111496 4725 generic.go:334] "Generic (PLEG): container finished" podID="40560461-f37c-4568-9c87-3c2578a66f6d" containerID="18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1" exitCode=0
Feb 27 07:19:56 crc kubenswrapper[4725]: I0227 07:19:56.111559 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerDied","Data":"18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1"}
Feb 27 07:19:57 crc kubenswrapper[4725]: I0227 07:19:57.126157 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerStarted","Data":"285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d"}
Feb 27 07:19:57 crc kubenswrapper[4725]: I0227 07:19:57.156159 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cdmlm" podStartSLOduration=3.62887247 podStartE2EDuration="7.156133414s" podCreationTimestamp="2026-02-27 07:19:50 +0000 UTC" firstStartedPulling="2026-02-27 07:19:53.054490021 +0000 UTC m=+4171.517110600" lastFinishedPulling="2026-02-27 07:19:56.581750985 +0000 UTC m=+4175.044371544" observedRunningTime="2026-02-27 07:19:57.149164337 +0000 UTC m=+4175.611784916" watchObservedRunningTime="2026-02-27 07:19:57.156133414 +0000 UTC m=+4175.618753993"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.151469 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536280-r54t2"]
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.153853 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536280-r54t2"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.156639 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.161876 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536280-r54t2"]
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.162022 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.162109 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.256733 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns69b\" (UniqueName: \"kubernetes.io/projected/0a192f82-e759-4c31-9e99-6815a6953484-kube-api-access-ns69b\") pod \"auto-csr-approver-29536280-r54t2\" (UID: \"0a192f82-e759-4c31-9e99-6815a6953484\") " pod="openshift-infra/auto-csr-approver-29536280-r54t2"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.359848 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns69b\" (UniqueName: \"kubernetes.io/projected/0a192f82-e759-4c31-9e99-6815a6953484-kube-api-access-ns69b\") pod \"auto-csr-approver-29536280-r54t2\" (UID: \"0a192f82-e759-4c31-9e99-6815a6953484\") " pod="openshift-infra/auto-csr-approver-29536280-r54t2"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.392607 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns69b\" (UniqueName: \"kubernetes.io/projected/0a192f82-e759-4c31-9e99-6815a6953484-kube-api-access-ns69b\") pod \"auto-csr-approver-29536280-r54t2\" (UID: \"0a192f82-e759-4c31-9e99-6815a6953484\") " pod="openshift-infra/auto-csr-approver-29536280-r54t2"
Feb 27 07:20:00 crc kubenswrapper[4725]: I0227 07:20:00.482272 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536280-r54t2"
Feb 27 07:20:01 crc kubenswrapper[4725]: I0227 07:20:01.231922 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:20:01 crc kubenswrapper[4725]: I0227 07:20:01.232171 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:20:01 crc kubenswrapper[4725]: I0227 07:20:01.286203 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.238999 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl74x" event={"ID":"88d9ca16-8811-47ec-9f57-f3e2e41620be","Type":"ContainerStarted","Data":"ada75e866c05ce3a7e36e7b5b794624edcf5089668d7583a0d7cd9863b55dfa1"}
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.360412 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.377253 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536280-r54t2"]
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.533153 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdmlm"]
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.554779 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.554867 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.554939 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969"
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.555860 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6208386aa9edfdd355d3967a7f5b53c627800b1b18b9a58493de299b226ea063"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 07:20:02 crc kubenswrapper[4725]: I0227 07:20:02.555934 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://6208386aa9edfdd355d3967a7f5b53c627800b1b18b9a58493de299b226ea063" gracePeriod=600
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.251629 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="6208386aa9edfdd355d3967a7f5b53c627800b1b18b9a58493de299b226ea063" exitCode=0
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.251732 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"6208386aa9edfdd355d3967a7f5b53c627800b1b18b9a58493de299b226ea063"}
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.252403 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"}
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.252436 4725 scope.go:117] "RemoveContainer" containerID="16f33753ab19f1bccb0ed3c2fa7c84c6d5b18e17cc153138813246cc006673bc"
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.256795 4725 generic.go:334] "Generic (PLEG): container finished" podID="88d9ca16-8811-47ec-9f57-f3e2e41620be" containerID="ada75e866c05ce3a7e36e7b5b794624edcf5089668d7583a0d7cd9863b55dfa1" exitCode=0
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.256832 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl74x" event={"ID":"88d9ca16-8811-47ec-9f57-f3e2e41620be","Type":"ContainerDied","Data":"ada75e866c05ce3a7e36e7b5b794624edcf5089668d7583a0d7cd9863b55dfa1"}
Feb 27 07:20:03 crc kubenswrapper[4725]: I0227 07:20:03.258721 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536280-r54t2" event={"ID":"0a192f82-e759-4c31-9e99-6815a6953484","Type":"ContainerStarted","Data":"cfb72f95f94f5e0b8ebe0b537aef6e4993ae2b46d34b80a766979a6a91b78d81"}
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.270271 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cdmlm" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="registry-server" containerID="cri-o://285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d" gracePeriod=2
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.838445 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdmlm"
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.982727 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-utilities\") pod \"40560461-f37c-4568-9c87-3c2578a66f6d\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") "
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.982973 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48g6d\" (UniqueName: \"kubernetes.io/projected/40560461-f37c-4568-9c87-3c2578a66f6d-kube-api-access-48g6d\") pod \"40560461-f37c-4568-9c87-3c2578a66f6d\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") "
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.983319 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-catalog-content\") pod \"40560461-f37c-4568-9c87-3c2578a66f6d\" (UID: \"40560461-f37c-4568-9c87-3c2578a66f6d\") "
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.983448 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-utilities" (OuterVolumeSpecName: "utilities") pod "40560461-f37c-4568-9c87-3c2578a66f6d" (UID: "40560461-f37c-4568-9c87-3c2578a66f6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.983733 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 07:20:04 crc kubenswrapper[4725]: I0227 07:20:04.988211 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40560461-f37c-4568-9c87-3c2578a66f6d-kube-api-access-48g6d" (OuterVolumeSpecName: "kube-api-access-48g6d") pod "40560461-f37c-4568-9c87-3c2578a66f6d" (UID: "40560461-f37c-4568-9c87-3c2578a66f6d"). InnerVolumeSpecName "kube-api-access-48g6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.033970 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40560461-f37c-4568-9c87-3c2578a66f6d" (UID: "40560461-f37c-4568-9c87-3c2578a66f6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.085811 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40560461-f37c-4568-9c87-3c2578a66f6d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.085850 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48g6d\" (UniqueName: \"kubernetes.io/projected/40560461-f37c-4568-9c87-3c2578a66f6d-kube-api-access-48g6d\") on node \"crc\" DevicePath \"\"" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.282503 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sl74x" event={"ID":"88d9ca16-8811-47ec-9f57-f3e2e41620be","Type":"ContainerStarted","Data":"94ef8d1074487ef7dc240d3555a3ff17418bdccc860ec4bd1a7b518a229a4da3"} Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.286113 4725 generic.go:334] "Generic (PLEG): container finished" podID="40560461-f37c-4568-9c87-3c2578a66f6d" containerID="285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d" exitCode=0 Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.286180 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerDied","Data":"285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d"} Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.286181 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdmlm" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.286226 4725 scope.go:117] "RemoveContainer" containerID="285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.286210 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmlm" event={"ID":"40560461-f37c-4568-9c87-3c2578a66f6d","Type":"ContainerDied","Data":"32f7a37c211559ed69602527ce3402b3a4003842f18dc4439563f644a66cb97d"} Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.288261 4725 generic.go:334] "Generic (PLEG): container finished" podID="0a192f82-e759-4c31-9e99-6815a6953484" containerID="2e12a9e3813d893c53e89937b9a0ad4093b3f1651b7dd336499df04a73ef51ca" exitCode=0 Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.288320 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536280-r54t2" event={"ID":"0a192f82-e759-4c31-9e99-6815a6953484","Type":"ContainerDied","Data":"2e12a9e3813d893c53e89937b9a0ad4093b3f1651b7dd336499df04a73ef51ca"} Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.321791 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sl74x" podStartSLOduration=4.099670088 podStartE2EDuration="13.321770985s" podCreationTimestamp="2026-02-27 07:19:52 +0000 UTC" firstStartedPulling="2026-02-27 07:19:55.094736633 +0000 UTC m=+4173.557357202" lastFinishedPulling="2026-02-27 07:20:04.31683753 +0000 UTC m=+4182.779458099" observedRunningTime="2026-02-27 07:20:05.309205819 +0000 UTC m=+4183.771826398" watchObservedRunningTime="2026-02-27 07:20:05.321770985 +0000 UTC m=+4183.784391574" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.344867 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdmlm"] Feb 27 07:20:05 crc 
kubenswrapper[4725]: I0227 07:20:05.348310 4725 scope.go:117] "RemoveContainer" containerID="18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.381369 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cdmlm"] Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.392173 4725 scope.go:117] "RemoveContainer" containerID="5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.442595 4725 scope.go:117] "RemoveContainer" containerID="285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d" Feb 27 07:20:05 crc kubenswrapper[4725]: E0227 07:20:05.443111 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d\": container with ID starting with 285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d not found: ID does not exist" containerID="285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.443160 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d"} err="failed to get container status \"285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d\": rpc error: code = NotFound desc = could not find container \"285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d\": container with ID starting with 285bdc85a4e6621101dced7b50c90c5864811486f805cc95d411db1139163f3d not found: ID does not exist" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.443186 4725 scope.go:117] "RemoveContainer" containerID="18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1" Feb 27 07:20:05 crc kubenswrapper[4725]: E0227 07:20:05.443678 4725 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1\": container with ID starting with 18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1 not found: ID does not exist" containerID="18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.443712 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1"} err="failed to get container status \"18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1\": rpc error: code = NotFound desc = could not find container \"18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1\": container with ID starting with 18057de16ff66bc4ba128fb3011fd1a584512b36862eac3d6bbcbcffcc9225f1 not found: ID does not exist" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.443755 4725 scope.go:117] "RemoveContainer" containerID="5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb" Feb 27 07:20:05 crc kubenswrapper[4725]: E0227 07:20:05.444189 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb\": container with ID starting with 5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb not found: ID does not exist" containerID="5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb" Feb 27 07:20:05 crc kubenswrapper[4725]: I0227 07:20:05.444231 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb"} err="failed to get container status \"5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb\": rpc error: code = NotFound 
desc = could not find container \"5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb\": container with ID starting with 5fff8caaf24ee24223ffff7a6129d87125f6e2882bb4e39b91c22076319ee8eb not found: ID does not exist" Feb 27 07:20:06 crc kubenswrapper[4725]: I0227 07:20:06.265733 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" path="/var/lib/kubelet/pods/40560461-f37c-4568-9c87-3c2578a66f6d/volumes" Feb 27 07:20:06 crc kubenswrapper[4725]: I0227 07:20:06.666886 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536280-r54t2" Feb 27 07:20:06 crc kubenswrapper[4725]: I0227 07:20:06.827366 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns69b\" (UniqueName: \"kubernetes.io/projected/0a192f82-e759-4c31-9e99-6815a6953484-kube-api-access-ns69b\") pod \"0a192f82-e759-4c31-9e99-6815a6953484\" (UID: \"0a192f82-e759-4c31-9e99-6815a6953484\") " Feb 27 07:20:06 crc kubenswrapper[4725]: I0227 07:20:06.839167 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a192f82-e759-4c31-9e99-6815a6953484-kube-api-access-ns69b" (OuterVolumeSpecName: "kube-api-access-ns69b") pod "0a192f82-e759-4c31-9e99-6815a6953484" (UID: "0a192f82-e759-4c31-9e99-6815a6953484"). InnerVolumeSpecName "kube-api-access-ns69b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:20:06 crc kubenswrapper[4725]: I0227 07:20:06.929854 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns69b\" (UniqueName: \"kubernetes.io/projected/0a192f82-e759-4c31-9e99-6815a6953484-kube-api-access-ns69b\") on node \"crc\" DevicePath \"\"" Feb 27 07:20:07 crc kubenswrapper[4725]: I0227 07:20:07.315946 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536280-r54t2" event={"ID":"0a192f82-e759-4c31-9e99-6815a6953484","Type":"ContainerDied","Data":"cfb72f95f94f5e0b8ebe0b537aef6e4993ae2b46d34b80a766979a6a91b78d81"} Feb 27 07:20:07 crc kubenswrapper[4725]: I0227 07:20:07.316434 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb72f95f94f5e0b8ebe0b537aef6e4993ae2b46d34b80a766979a6a91b78d81" Feb 27 07:20:07 crc kubenswrapper[4725]: I0227 07:20:07.316022 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536280-r54t2" Feb 27 07:20:07 crc kubenswrapper[4725]: I0227 07:20:07.735240 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536274-jhndd"] Feb 27 07:20:07 crc kubenswrapper[4725]: I0227 07:20:07.745471 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536274-jhndd"] Feb 27 07:20:08 crc kubenswrapper[4725]: I0227 07:20:08.261555 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9273eb45-5505-47f1-83d9-cdde7774cb2a" path="/var/lib/kubelet/pods/9273eb45-5505-47f1-83d9-cdde7774cb2a/volumes" Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.330490 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sl74x" Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.330983 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-sl74x" Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.401206 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sl74x" Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.479719 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sl74x" Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.555094 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sl74x"] Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.651510 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnhx6"] Feb 27 07:20:13 crc kubenswrapper[4725]: I0227 07:20:13.651827 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gnhx6" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="registry-server" containerID="cri-o://c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18" gracePeriod=2 Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.256663 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.389903 4725 generic.go:334] "Generic (PLEG): container finished" podID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerID="c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18" exitCode=0 Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.389982 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerDied","Data":"c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18"} Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.390021 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gnhx6" event={"ID":"eb9bdb7d-d8b8-4937-9970-f32ee3abe121","Type":"ContainerDied","Data":"5de2619278768d736effbd06aa3cef62c227307e79b191ec495497c096293a08"} Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.390042 4725 scope.go:117] "RemoveContainer" containerID="c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.390155 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gnhx6" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.394948 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-catalog-content\") pod \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.395028 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfw4\" (UniqueName: \"kubernetes.io/projected/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-kube-api-access-jwfw4\") pod \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.395108 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-utilities\") pod \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\" (UID: \"eb9bdb7d-d8b8-4937-9970-f32ee3abe121\") " Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.399223 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-utilities" (OuterVolumeSpecName: "utilities") pod "eb9bdb7d-d8b8-4937-9970-f32ee3abe121" (UID: "eb9bdb7d-d8b8-4937-9970-f32ee3abe121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.412550 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-kube-api-access-jwfw4" (OuterVolumeSpecName: "kube-api-access-jwfw4") pod "eb9bdb7d-d8b8-4937-9970-f32ee3abe121" (UID: "eb9bdb7d-d8b8-4937-9970-f32ee3abe121"). InnerVolumeSpecName "kube-api-access-jwfw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.436820 4725 scope.go:117] "RemoveContainer" containerID="6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.477513 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb9bdb7d-d8b8-4937-9970-f32ee3abe121" (UID: "eb9bdb7d-d8b8-4937-9970-f32ee3abe121"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.494924 4725 scope.go:117] "RemoveContainer" containerID="8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.502591 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.502634 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfw4\" (UniqueName: \"kubernetes.io/projected/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-kube-api-access-jwfw4\") on node \"crc\" DevicePath \"\"" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.502651 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb9bdb7d-d8b8-4937-9970-f32ee3abe121-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.589048 4725 scope.go:117] "RemoveContainer" containerID="c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18" Feb 27 07:20:14 crc kubenswrapper[4725]: E0227 07:20:14.593514 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18\": container with ID starting with c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18 not found: ID does not exist" containerID="c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.593563 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18"} err="failed to get container status \"c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18\": rpc error: code = NotFound desc = could not find container \"c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18\": container with ID starting with c9469ee933edf4542f79de210fb7d21054307aa0e1658a30df925ddad9eaeb18 not found: ID does not exist" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.593587 4725 scope.go:117] "RemoveContainer" containerID="6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4" Feb 27 07:20:14 crc kubenswrapper[4725]: E0227 07:20:14.596870 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4\": container with ID starting with 6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4 not found: ID does not exist" containerID="6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.596897 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4"} err="failed to get container status \"6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4\": rpc error: code = NotFound desc = could not find container 
\"6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4\": container with ID starting with 6dd9f5f1504eca40dd937e650d0f7926b928afd813ad7b970eb3325ebd8d05f4 not found: ID does not exist" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.596913 4725 scope.go:117] "RemoveContainer" containerID="8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e" Feb 27 07:20:14 crc kubenswrapper[4725]: E0227 07:20:14.600352 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e\": container with ID starting with 8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e not found: ID does not exist" containerID="8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.600377 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e"} err="failed to get container status \"8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e\": rpc error: code = NotFound desc = could not find container \"8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e\": container with ID starting with 8300c0a74dd6d1ba2b8eb5891fa8abc05138875948ed95caac62d6727128c42e not found: ID does not exist" Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.740838 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gnhx6"] Feb 27 07:20:14 crc kubenswrapper[4725]: I0227 07:20:14.763150 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gnhx6"] Feb 27 07:20:16 crc kubenswrapper[4725]: I0227 07:20:16.264883 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" 
path="/var/lib/kubelet/pods/eb9bdb7d-d8b8-4937-9970-f32ee3abe121/volumes" Feb 27 07:20:44 crc kubenswrapper[4725]: I0227 07:20:44.775825 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="76702ae7-c9e6-485b-abc9-b54e4c073ee1" containerName="galera" probeResult="failure" output="command timed out" Feb 27 07:21:06 crc kubenswrapper[4725]: I0227 07:21:06.515702 4725 scope.go:117] "RemoveContainer" containerID="92eeb95ae20328609d196d9049fd05ff8ae0b5df1f7ff202a8bb0f4b84bf1704" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.158059 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536282-8x8ns"] Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159402 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="registry-server" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159425 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="registry-server" Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159457 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="extract-utilities" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159467 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="extract-utilities" Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159496 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="extract-utilities" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159509 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="extract-utilities" Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159549 4725 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0a192f82-e759-4c31-9e99-6815a6953484" containerName="oc" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159559 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a192f82-e759-4c31-9e99-6815a6953484" containerName="oc" Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159577 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="registry-server" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159586 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="registry-server" Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159601 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="extract-content" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159612 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="extract-content" Feb 27 07:22:00 crc kubenswrapper[4725]: E0227 07:22:00.159631 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="extract-content" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159642 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="extract-content" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159967 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9bdb7d-d8b8-4937-9970-f32ee3abe121" containerName="registry-server" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.159990 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a192f82-e759-4c31-9e99-6815a6953484" containerName="oc" Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.160016 4725 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="40560461-f37c-4568-9c87-3c2578a66f6d" containerName="registry-server"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.161475 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.164326 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.164922 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.181434 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.188326 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536282-8x8ns"]
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.264768 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcmz7\" (UniqueName: \"kubernetes.io/projected/ce9e4130-3836-4f21-a186-ce361727c44a-kube-api-access-dcmz7\") pod \"auto-csr-approver-29536282-8x8ns\" (UID: \"ce9e4130-3836-4f21-a186-ce361727c44a\") " pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.366792 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcmz7\" (UniqueName: \"kubernetes.io/projected/ce9e4130-3836-4f21-a186-ce361727c44a-kube-api-access-dcmz7\") pod \"auto-csr-approver-29536282-8x8ns\" (UID: \"ce9e4130-3836-4f21-a186-ce361727c44a\") " pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.389982 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcmz7\" (UniqueName: \"kubernetes.io/projected/ce9e4130-3836-4f21-a186-ce361727c44a-kube-api-access-dcmz7\") pod \"auto-csr-approver-29536282-8x8ns\" (UID: \"ce9e4130-3836-4f21-a186-ce361727c44a\") " pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.496486 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:00 crc kubenswrapper[4725]: I0227 07:22:00.969384 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536282-8x8ns"]
Feb 27 07:22:00 crc kubenswrapper[4725]: W0227 07:22:00.979261 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce9e4130_3836_4f21_a186_ce361727c44a.slice/crio-94e4d898e21f8cd684abdb5ef568d20fcfef698a5ab124dd67506bda8d8376f3 WatchSource:0}: Error finding container 94e4d898e21f8cd684abdb5ef568d20fcfef698a5ab124dd67506bda8d8376f3: Status 404 returned error can't find the container with id 94e4d898e21f8cd684abdb5ef568d20fcfef698a5ab124dd67506bda8d8376f3
Feb 27 07:22:01 crc kubenswrapper[4725]: I0227 07:22:01.460668 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536282-8x8ns" event={"ID":"ce9e4130-3836-4f21-a186-ce361727c44a","Type":"ContainerStarted","Data":"94e4d898e21f8cd684abdb5ef568d20fcfef698a5ab124dd67506bda8d8376f3"}
Feb 27 07:22:02 crc kubenswrapper[4725]: I0227 07:22:02.474361 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536282-8x8ns" event={"ID":"ce9e4130-3836-4f21-a186-ce361727c44a","Type":"ContainerStarted","Data":"01c2a059ccc79fc6d086ed88bf49e382019a15a3d57b9f3e929ea6dc98be162e"}
Feb 27 07:22:02 crc kubenswrapper[4725]: I0227 07:22:02.515957 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536282-8x8ns" podStartSLOduration=1.632600922 podStartE2EDuration="2.515932184s" podCreationTimestamp="2026-02-27 07:22:00 +0000 UTC" firstStartedPulling="2026-02-27 07:22:01.001565488 +0000 UTC m=+4299.464186077" lastFinishedPulling="2026-02-27 07:22:01.88489676 +0000 UTC m=+4300.347517339" observedRunningTime="2026-02-27 07:22:02.496887395 +0000 UTC m=+4300.959507974" watchObservedRunningTime="2026-02-27 07:22:02.515932184 +0000 UTC m=+4300.978552753"
Feb 27 07:22:02 crc kubenswrapper[4725]: I0227 07:22:02.553987 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 07:22:02 crc kubenswrapper[4725]: I0227 07:22:02.554039 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 07:22:03 crc kubenswrapper[4725]: I0227 07:22:03.485269 4725 generic.go:334] "Generic (PLEG): container finished" podID="ce9e4130-3836-4f21-a186-ce361727c44a" containerID="01c2a059ccc79fc6d086ed88bf49e382019a15a3d57b9f3e929ea6dc98be162e" exitCode=0
Feb 27 07:22:03 crc kubenswrapper[4725]: I0227 07:22:03.485324 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536282-8x8ns" event={"ID":"ce9e4130-3836-4f21-a186-ce361727c44a","Type":"ContainerDied","Data":"01c2a059ccc79fc6d086ed88bf49e382019a15a3d57b9f3e929ea6dc98be162e"}
Feb 27 07:22:04 crc kubenswrapper[4725]: I0227 07:22:04.871093 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:04 crc kubenswrapper[4725]: I0227 07:22:04.890400 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcmz7\" (UniqueName: \"kubernetes.io/projected/ce9e4130-3836-4f21-a186-ce361727c44a-kube-api-access-dcmz7\") pod \"ce9e4130-3836-4f21-a186-ce361727c44a\" (UID: \"ce9e4130-3836-4f21-a186-ce361727c44a\") "
Feb 27 07:22:04 crc kubenswrapper[4725]: I0227 07:22:04.896002 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9e4130-3836-4f21-a186-ce361727c44a-kube-api-access-dcmz7" (OuterVolumeSpecName: "kube-api-access-dcmz7") pod "ce9e4130-3836-4f21-a186-ce361727c44a" (UID: "ce9e4130-3836-4f21-a186-ce361727c44a"). InnerVolumeSpecName "kube-api-access-dcmz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 07:22:04 crc kubenswrapper[4725]: I0227 07:22:04.992739 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcmz7\" (UniqueName: \"kubernetes.io/projected/ce9e4130-3836-4f21-a186-ce361727c44a-kube-api-access-dcmz7\") on node \"crc\" DevicePath \"\""
Feb 27 07:22:05 crc kubenswrapper[4725]: I0227 07:22:05.358940 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536276-w95m8"]
Feb 27 07:22:05 crc kubenswrapper[4725]: I0227 07:22:05.376701 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536276-w95m8"]
Feb 27 07:22:05 crc kubenswrapper[4725]: I0227 07:22:05.502645 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536282-8x8ns" event={"ID":"ce9e4130-3836-4f21-a186-ce361727c44a","Type":"ContainerDied","Data":"94e4d898e21f8cd684abdb5ef568d20fcfef698a5ab124dd67506bda8d8376f3"}
Feb 27 07:22:05 crc kubenswrapper[4725]: I0227 07:22:05.502691 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e4d898e21f8cd684abdb5ef568d20fcfef698a5ab124dd67506bda8d8376f3"
Feb 27 07:22:05 crc kubenswrapper[4725]: I0227 07:22:05.502719 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536282-8x8ns"
Feb 27 07:22:06 crc kubenswrapper[4725]: I0227 07:22:06.274360 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff88595-4388-4081-85f1-51a28a80e5d3" path="/var/lib/kubelet/pods/dff88595-4388-4081-85f1-51a28a80e5d3/volumes"
Feb 27 07:22:06 crc kubenswrapper[4725]: I0227 07:22:06.630890 4725 scope.go:117] "RemoveContainer" containerID="6c06564b8af63d71533db047b5d78ca3781a843280b05178db6c0abe6733cb9f"
Feb 27 07:22:32 crc kubenswrapper[4725]: I0227 07:22:32.554942 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 07:22:32 crc kubenswrapper[4725]: I0227 07:22:32.555711 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 07:23:02 crc kubenswrapper[4725]: I0227 07:23:02.554417 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 07:23:02 crc kubenswrapper[4725]: I0227 07:23:02.554986 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 07:23:02 crc kubenswrapper[4725]: I0227 07:23:02.555143 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969"
Feb 27 07:23:02 crc kubenswrapper[4725]: I0227 07:23:02.556356 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 07:23:02 crc kubenswrapper[4725]: I0227 07:23:02.556469 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" gracePeriod=600
Feb 27 07:23:02 crc kubenswrapper[4725]: E0227 07:23:02.683337 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:23:03 crc kubenswrapper[4725]: I0227 07:23:03.189556 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" exitCode=0
Feb 27 07:23:03 crc kubenswrapper[4725]: I0227 07:23:03.189603 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"}
Feb 27 07:23:03 crc kubenswrapper[4725]: I0227 07:23:03.189640 4725 scope.go:117] "RemoveContainer" containerID="6208386aa9edfdd355d3967a7f5b53c627800b1b18b9a58493de299b226ea063"
Feb 27 07:23:03 crc kubenswrapper[4725]: I0227 07:23:03.190625 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:23:03 crc kubenswrapper[4725]: E0227 07:23:03.192674 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:23:17 crc kubenswrapper[4725]: I0227 07:23:17.251800 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:23:17 crc kubenswrapper[4725]: E0227 07:23:17.252749 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:23:29 crc kubenswrapper[4725]: I0227 07:23:29.255579 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:23:29 crc kubenswrapper[4725]: E0227 07:23:29.256712 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:23:44 crc kubenswrapper[4725]: I0227 07:23:44.252522 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:23:44 crc kubenswrapper[4725]: E0227 07:23:44.253429 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:23:59 crc kubenswrapper[4725]: I0227 07:23:59.252246 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:23:59 crc kubenswrapper[4725]: E0227 07:23:59.252994 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.138535 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536284-fkszf"]
Feb 27 07:24:00 crc kubenswrapper[4725]: E0227 07:24:00.139151 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9e4130-3836-4f21-a186-ce361727c44a" containerName="oc"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.139174 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9e4130-3836-4f21-a186-ce361727c44a" containerName="oc"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.139398 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9e4130-3836-4f21-a186-ce361727c44a" containerName="oc"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.140057 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.142173 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.142740 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.153649 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.157471 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536284-fkszf"]
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.317691 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zph\" (UniqueName: \"kubernetes.io/projected/4ac7f261-80f3-4075-9f99-069ba18a12ff-kube-api-access-s9zph\") pod \"auto-csr-approver-29536284-fkszf\" (UID: \"4ac7f261-80f3-4075-9f99-069ba18a12ff\") " pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.420945 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zph\" (UniqueName: \"kubernetes.io/projected/4ac7f261-80f3-4075-9f99-069ba18a12ff-kube-api-access-s9zph\") pod \"auto-csr-approver-29536284-fkszf\" (UID: \"4ac7f261-80f3-4075-9f99-069ba18a12ff\") " pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.443902 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zph\" (UniqueName: \"kubernetes.io/projected/4ac7f261-80f3-4075-9f99-069ba18a12ff-kube-api-access-s9zph\") pod \"auto-csr-approver-29536284-fkszf\" (UID: \"4ac7f261-80f3-4075-9f99-069ba18a12ff\") " pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.475674 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.980309 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536284-fkszf"]
Feb 27 07:24:00 crc kubenswrapper[4725]: I0227 07:24:00.985774 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 07:24:01 crc kubenswrapper[4725]: I0227 07:24:01.115020 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536284-fkszf" event={"ID":"4ac7f261-80f3-4075-9f99-069ba18a12ff","Type":"ContainerStarted","Data":"b499789e5c239ca914076555cff56514859a582544e0743093d50a27bd141882"}
Feb 27 07:24:03 crc kubenswrapper[4725]: I0227 07:24:03.150629 4725 generic.go:334] "Generic (PLEG): container finished" podID="4ac7f261-80f3-4075-9f99-069ba18a12ff" containerID="9318b5132efeae727ae09e482eda2b315ab4e68cc5704f6de3f2c162198a76d6" exitCode=0
Feb 27 07:24:03 crc kubenswrapper[4725]: I0227 07:24:03.151273 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536284-fkszf" event={"ID":"4ac7f261-80f3-4075-9f99-069ba18a12ff","Type":"ContainerDied","Data":"9318b5132efeae727ae09e482eda2b315ab4e68cc5704f6de3f2c162198a76d6"}
Feb 27 07:24:04 crc kubenswrapper[4725]: I0227 07:24:04.683812 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:04 crc kubenswrapper[4725]: I0227 07:24:04.740461 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9zph\" (UniqueName: \"kubernetes.io/projected/4ac7f261-80f3-4075-9f99-069ba18a12ff-kube-api-access-s9zph\") pod \"4ac7f261-80f3-4075-9f99-069ba18a12ff\" (UID: \"4ac7f261-80f3-4075-9f99-069ba18a12ff\") "
Feb 27 07:24:04 crc kubenswrapper[4725]: I0227 07:24:04.752645 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac7f261-80f3-4075-9f99-069ba18a12ff-kube-api-access-s9zph" (OuterVolumeSpecName: "kube-api-access-s9zph") pod "4ac7f261-80f3-4075-9f99-069ba18a12ff" (UID: "4ac7f261-80f3-4075-9f99-069ba18a12ff"). InnerVolumeSpecName "kube-api-access-s9zph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 07:24:04 crc kubenswrapper[4725]: I0227 07:24:04.842483 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9zph\" (UniqueName: \"kubernetes.io/projected/4ac7f261-80f3-4075-9f99-069ba18a12ff-kube-api-access-s9zph\") on node \"crc\" DevicePath \"\""
Feb 27 07:24:05 crc kubenswrapper[4725]: I0227 07:24:05.209314 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536284-fkszf" event={"ID":"4ac7f261-80f3-4075-9f99-069ba18a12ff","Type":"ContainerDied","Data":"b499789e5c239ca914076555cff56514859a582544e0743093d50a27bd141882"}
Feb 27 07:24:05 crc kubenswrapper[4725]: I0227 07:24:05.209407 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b499789e5c239ca914076555cff56514859a582544e0743093d50a27bd141882"
Feb 27 07:24:05 crc kubenswrapper[4725]: I0227 07:24:05.209354 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536284-fkszf"
Feb 27 07:24:05 crc kubenswrapper[4725]: I0227 07:24:05.779402 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536278-6z9wd"]
Feb 27 07:24:05 crc kubenswrapper[4725]: I0227 07:24:05.791883 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536278-6z9wd"]
Feb 27 07:24:06 crc kubenswrapper[4725]: I0227 07:24:06.270927 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2c8da1-5c71-40c5-92e6-b92b426977f0" path="/var/lib/kubelet/pods/eb2c8da1-5c71-40c5-92e6-b92b426977f0/volumes"
Feb 27 07:24:07 crc kubenswrapper[4725]: I0227 07:24:07.287323 4725 scope.go:117] "RemoveContainer" containerID="bcb25cc17ff68025716241fd3c69e6d502681033acf6219313ff44df0d5fe04c"
Feb 27 07:24:11 crc kubenswrapper[4725]: I0227 07:24:11.251772 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:24:11 crc kubenswrapper[4725]: E0227 07:24:11.252734 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:24:26 crc kubenswrapper[4725]: I0227 07:24:26.251857 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:24:26 crc kubenswrapper[4725]: E0227 07:24:26.252558 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:24:41 crc kubenswrapper[4725]: I0227 07:24:41.251512 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:24:41 crc kubenswrapper[4725]: E0227 07:24:41.252238 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:24:56 crc kubenswrapper[4725]: I0227 07:24:56.252168 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:24:56 crc kubenswrapper[4725]: E0227 07:24:56.258623 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:25:08 crc kubenswrapper[4725]: I0227 07:25:08.251848 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:25:08 crc kubenswrapper[4725]: E0227 07:25:08.252572 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:25:19 crc kubenswrapper[4725]: I0227 07:25:19.256392 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:25:19 crc kubenswrapper[4725]: E0227 07:25:19.259518 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:25:32 crc kubenswrapper[4725]: I0227 07:25:32.261915 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:25:32 crc kubenswrapper[4725]: E0227 07:25:32.262929 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:25:46 crc kubenswrapper[4725]: I0227 07:25:46.252268 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:25:46 crc kubenswrapper[4725]: E0227 07:25:46.253076 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:25:58 crc kubenswrapper[4725]: I0227 07:25:58.253081 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:25:58 crc kubenswrapper[4725]: E0227 07:25:58.253952 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.153985 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536286-99n9t"]
Feb 27 07:26:00 crc kubenswrapper[4725]: E0227 07:26:00.155155 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac7f261-80f3-4075-9f99-069ba18a12ff" containerName="oc"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.155178 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac7f261-80f3-4075-9f99-069ba18a12ff" containerName="oc"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.155608 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac7f261-80f3-4075-9f99-069ba18a12ff" containerName="oc"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.156577 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.160580 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.161248 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.161979 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.166112 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536286-99n9t"]
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.318933 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bqn\" (UniqueName: \"kubernetes.io/projected/8c7fd961-7665-421b-8680-f8468e642650-kube-api-access-s9bqn\") pod \"auto-csr-approver-29536286-99n9t\" (UID: \"8c7fd961-7665-421b-8680-f8468e642650\") " pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:00 crc kubenswrapper[4725]: I0227 07:26:00.426071 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bqn\" (UniqueName: \"kubernetes.io/projected/8c7fd961-7665-421b-8680-f8468e642650-kube-api-access-s9bqn\") pod \"auto-csr-approver-29536286-99n9t\" (UID: \"8c7fd961-7665-421b-8680-f8468e642650\") " pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:01 crc kubenswrapper[4725]: I0227 07:26:01.276323 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bqn\" (UniqueName: \"kubernetes.io/projected/8c7fd961-7665-421b-8680-f8468e642650-kube-api-access-s9bqn\") pod \"auto-csr-approver-29536286-99n9t\" (UID: \"8c7fd961-7665-421b-8680-f8468e642650\") " pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:01 crc kubenswrapper[4725]: I0227 07:26:01.375246 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:01 crc kubenswrapper[4725]: I0227 07:26:01.871068 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536286-99n9t"]
Feb 27 07:26:02 crc kubenswrapper[4725]: I0227 07:26:02.226688 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536286-99n9t" event={"ID":"8c7fd961-7665-421b-8680-f8468e642650","Type":"ContainerStarted","Data":"e294fe428eb836ef8f5bacd8ed87ec2dc73723c2113790a8c0034a808e8a86c6"}
Feb 27 07:26:04 crc kubenswrapper[4725]: I0227 07:26:04.245222 4725 generic.go:334] "Generic (PLEG): container finished" podID="8c7fd961-7665-421b-8680-f8468e642650" containerID="3d117b165f8da445983d5fd3862a89f2ecb9fd2df77869e8e74eeec127b2a6f5" exitCode=0
Feb 27 07:26:04 crc kubenswrapper[4725]: I0227 07:26:04.245307 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536286-99n9t" event={"ID":"8c7fd961-7665-421b-8680-f8468e642650","Type":"ContainerDied","Data":"3d117b165f8da445983d5fd3862a89f2ecb9fd2df77869e8e74eeec127b2a6f5"}
Feb 27 07:26:05 crc kubenswrapper[4725]: I0227 07:26:05.663707 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:05 crc kubenswrapper[4725]: I0227 07:26:05.753892 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9bqn\" (UniqueName: \"kubernetes.io/projected/8c7fd961-7665-421b-8680-f8468e642650-kube-api-access-s9bqn\") pod \"8c7fd961-7665-421b-8680-f8468e642650\" (UID: \"8c7fd961-7665-421b-8680-f8468e642650\") "
Feb 27 07:26:05 crc kubenswrapper[4725]: I0227 07:26:05.759874 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7fd961-7665-421b-8680-f8468e642650-kube-api-access-s9bqn" (OuterVolumeSpecName: "kube-api-access-s9bqn") pod "8c7fd961-7665-421b-8680-f8468e642650" (UID: "8c7fd961-7665-421b-8680-f8468e642650"). InnerVolumeSpecName "kube-api-access-s9bqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 07:26:05 crc kubenswrapper[4725]: I0227 07:26:05.856611 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9bqn\" (UniqueName: \"kubernetes.io/projected/8c7fd961-7665-421b-8680-f8468e642650-kube-api-access-s9bqn\") on node \"crc\" DevicePath \"\""
Feb 27 07:26:06 crc kubenswrapper[4725]: I0227 07:26:06.274372 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536286-99n9t" event={"ID":"8c7fd961-7665-421b-8680-f8468e642650","Type":"ContainerDied","Data":"e294fe428eb836ef8f5bacd8ed87ec2dc73723c2113790a8c0034a808e8a86c6"}
Feb 27 07:26:06 crc kubenswrapper[4725]: I0227 07:26:06.274429 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536286-99n9t"
Feb 27 07:26:06 crc kubenswrapper[4725]: I0227 07:26:06.274471 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e294fe428eb836ef8f5bacd8ed87ec2dc73723c2113790a8c0034a808e8a86c6"
Feb 27 07:26:06 crc kubenswrapper[4725]: I0227 07:26:06.757522 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536280-r54t2"]
Feb 27 07:26:06 crc kubenswrapper[4725]: I0227 07:26:06.767853 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536280-r54t2"]
Feb 27 07:26:08 crc kubenswrapper[4725]: I0227 07:26:08.271532 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a192f82-e759-4c31-9e99-6815a6953484" path="/var/lib/kubelet/pods/0a192f82-e759-4c31-9e99-6815a6953484/volumes"
Feb 27 07:26:13 crc kubenswrapper[4725]: I0227 07:26:13.251209 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:26:13 crc kubenswrapper[4725]: E0227 07:26:13.252037 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:26:24 crc kubenswrapper[4725]: I0227 07:26:24.251847 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:26:24 crc kubenswrapper[4725]: E0227 07:26:24.252534 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:26:36 crc kubenswrapper[4725]: I0227 07:26:36.251978 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:26:36 crc kubenswrapper[4725]: E0227 07:26:36.253005 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:26:47 crc kubenswrapper[4725]: I0227 07:26:47.252028 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:26:47 crc kubenswrapper[4725]: E0227 07:26:47.253451 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:27:00 crc kubenswrapper[4725]: I0227 07:27:00.251275 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51"
Feb 27 07:27:00 crc kubenswrapper[4725]: E0227 07:27:00.252191 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"
Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.779438 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rh7jr"]
Feb 27 07:27:01 crc kubenswrapper[4725]: E0227 07:27:01.780252 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7fd961-7665-421b-8680-f8468e642650" containerName="oc"
Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.780267 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7fd961-7665-421b-8680-f8468e642650" containerName="oc"
Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.780550 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7fd961-7665-421b-8680-f8468e642650" containerName="oc"
Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.782468 4725 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.792269 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rh7jr"] Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.813919 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v7w6\" (UniqueName: \"kubernetes.io/projected/438c69f1-41f6-4739-9b5d-62e7a1e91449-kube-api-access-8v7w6\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.814115 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-catalog-content\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.814160 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-utilities\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.915959 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-catalog-content\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.916530 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-utilities\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.916628 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-catalog-content\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.916689 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v7w6\" (UniqueName: \"kubernetes.io/projected/438c69f1-41f6-4739-9b5d-62e7a1e91449-kube-api-access-8v7w6\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.917070 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-utilities\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:01 crc kubenswrapper[4725]: I0227 07:27:01.950309 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v7w6\" (UniqueName: \"kubernetes.io/projected/438c69f1-41f6-4739-9b5d-62e7a1e91449-kube-api-access-8v7w6\") pod \"redhat-marketplace-rh7jr\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:02 crc kubenswrapper[4725]: I0227 07:27:02.166080 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:02 crc kubenswrapper[4725]: I0227 07:27:02.622166 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rh7jr"] Feb 27 07:27:02 crc kubenswrapper[4725]: I0227 07:27:02.877304 4725 generic.go:334] "Generic (PLEG): container finished" podID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerID="336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314" exitCode=0 Feb 27 07:27:02 crc kubenswrapper[4725]: I0227 07:27:02.877856 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerDied","Data":"336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314"} Feb 27 07:27:02 crc kubenswrapper[4725]: I0227 07:27:02.877930 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerStarted","Data":"6f34ad001cb012ab46dd539f6aacaf589f0c7022e3cdd1b5324fcb37e337d2e7"} Feb 27 07:27:04 crc kubenswrapper[4725]: I0227 07:27:04.897436 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerStarted","Data":"81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f"} Feb 27 07:27:05 crc kubenswrapper[4725]: I0227 07:27:05.914624 4725 generic.go:334] "Generic (PLEG): container finished" podID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerID="81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f" exitCode=0 Feb 27 07:27:05 crc kubenswrapper[4725]: I0227 07:27:05.915672 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" 
event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerDied","Data":"81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f"} Feb 27 07:27:05 crc kubenswrapper[4725]: I0227 07:27:05.915804 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerStarted","Data":"8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784"} Feb 27 07:27:05 crc kubenswrapper[4725]: I0227 07:27:05.962168 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rh7jr" podStartSLOduration=2.557158135 podStartE2EDuration="4.962140489s" podCreationTimestamp="2026-02-27 07:27:01 +0000 UTC" firstStartedPulling="2026-02-27 07:27:02.880094176 +0000 UTC m=+4601.342714755" lastFinishedPulling="2026-02-27 07:27:05.28507653 +0000 UTC m=+4603.747697109" observedRunningTime="2026-02-27 07:27:05.940039633 +0000 UTC m=+4604.402660212" watchObservedRunningTime="2026-02-27 07:27:05.962140489 +0000 UTC m=+4604.424761098" Feb 27 07:27:07 crc kubenswrapper[4725]: I0227 07:27:07.421309 4725 scope.go:117] "RemoveContainer" containerID="2e12a9e3813d893c53e89937b9a0ad4093b3f1651b7dd336499df04a73ef51ca" Feb 27 07:27:12 crc kubenswrapper[4725]: I0227 07:27:12.166681 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:12 crc kubenswrapper[4725]: I0227 07:27:12.167367 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:12 crc kubenswrapper[4725]: I0227 07:27:12.270714 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:13 crc kubenswrapper[4725]: I0227 07:27:13.338970 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:15 crc kubenswrapper[4725]: I0227 07:27:15.252151 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" Feb 27 07:27:15 crc kubenswrapper[4725]: E0227 07:27:15.252713 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:27:15 crc kubenswrapper[4725]: I0227 07:27:15.355510 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rh7jr"] Feb 27 07:27:15 crc kubenswrapper[4725]: I0227 07:27:15.355724 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rh7jr" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="registry-server" containerID="cri-o://8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784" gracePeriod=2 Feb 27 07:27:15 crc kubenswrapper[4725]: I0227 07:27:15.858344 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.015035 4725 generic.go:334] "Generic (PLEG): container finished" podID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerID="8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784" exitCode=0 Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.015085 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerDied","Data":"8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784"} Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.015116 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rh7jr" event={"ID":"438c69f1-41f6-4739-9b5d-62e7a1e91449","Type":"ContainerDied","Data":"6f34ad001cb012ab46dd539f6aacaf589f0c7022e3cdd1b5324fcb37e337d2e7"} Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.015113 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rh7jr" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.015151 4725 scope.go:117] "RemoveContainer" containerID="8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.033951 4725 scope.go:117] "RemoveContainer" containerID="81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.038835 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-catalog-content\") pod \"438c69f1-41f6-4739-9b5d-62e7a1e91449\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.038932 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-utilities\") pod \"438c69f1-41f6-4739-9b5d-62e7a1e91449\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.038980 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v7w6\" (UniqueName: \"kubernetes.io/projected/438c69f1-41f6-4739-9b5d-62e7a1e91449-kube-api-access-8v7w6\") pod \"438c69f1-41f6-4739-9b5d-62e7a1e91449\" (UID: \"438c69f1-41f6-4739-9b5d-62e7a1e91449\") " Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.040413 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-utilities" (OuterVolumeSpecName: "utilities") pod "438c69f1-41f6-4739-9b5d-62e7a1e91449" (UID: "438c69f1-41f6-4739-9b5d-62e7a1e91449"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.045126 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438c69f1-41f6-4739-9b5d-62e7a1e91449-kube-api-access-8v7w6" (OuterVolumeSpecName: "kube-api-access-8v7w6") pod "438c69f1-41f6-4739-9b5d-62e7a1e91449" (UID: "438c69f1-41f6-4739-9b5d-62e7a1e91449"). InnerVolumeSpecName "kube-api-access-8v7w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.055174 4725 scope.go:117] "RemoveContainer" containerID="336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.064232 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "438c69f1-41f6-4739-9b5d-62e7a1e91449" (UID: "438c69f1-41f6-4739-9b5d-62e7a1e91449"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.141474 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.141505 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/438c69f1-41f6-4739-9b5d-62e7a1e91449-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.141514 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v7w6\" (UniqueName: \"kubernetes.io/projected/438c69f1-41f6-4739-9b5d-62e7a1e91449-kube-api-access-8v7w6\") on node \"crc\" DevicePath \"\"" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.158875 4725 scope.go:117] "RemoveContainer" containerID="8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784" Feb 27 07:27:16 crc kubenswrapper[4725]: E0227 07:27:16.159207 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784\": container with ID starting with 8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784 not found: ID does not exist" containerID="8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.159238 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784"} err="failed to get container status \"8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784\": rpc error: code = NotFound desc = could not find container \"8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784\": container with ID 
starting with 8c0c9874d9c24bd0beed01402a81288403595de82e0d02a0f609e656aaeb2784 not found: ID does not exist" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.159258 4725 scope.go:117] "RemoveContainer" containerID="81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f" Feb 27 07:27:16 crc kubenswrapper[4725]: E0227 07:27:16.159525 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f\": container with ID starting with 81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f not found: ID does not exist" containerID="81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.159550 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f"} err="failed to get container status \"81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f\": rpc error: code = NotFound desc = could not find container \"81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f\": container with ID starting with 81cdc1d862b86bcbc918e27a4f9e5e2e30171a0ccf9fbec6a8133fdb6d15b96f not found: ID does not exist" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.159562 4725 scope.go:117] "RemoveContainer" containerID="336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314" Feb 27 07:27:16 crc kubenswrapper[4725]: E0227 07:27:16.159750 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314\": container with ID starting with 336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314 not found: ID does not exist" containerID="336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314" Feb 27 
07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.159767 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314"} err="failed to get container status \"336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314\": rpc error: code = NotFound desc = could not find container \"336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314\": container with ID starting with 336f29d3f1983163075a53a2abc2b7a081299500eb68a62d5308afc8be8ba314 not found: ID does not exist" Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.360408 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rh7jr"] Feb 27 07:27:16 crc kubenswrapper[4725]: I0227 07:27:16.373636 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rh7jr"] Feb 27 07:27:18 crc kubenswrapper[4725]: I0227 07:27:18.265152 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" path="/var/lib/kubelet/pods/438c69f1-41f6-4739-9b5d-62e7a1e91449/volumes" Feb 27 07:27:26 crc kubenswrapper[4725]: I0227 07:27:26.252038 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" Feb 27 07:27:26 crc kubenswrapper[4725]: E0227 07:27:26.252890 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:27:37 crc kubenswrapper[4725]: I0227 07:27:37.252484 4725 scope.go:117] "RemoveContainer" 
containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" Feb 27 07:27:37 crc kubenswrapper[4725]: E0227 07:27:37.253511 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:27:51 crc kubenswrapper[4725]: I0227 07:27:51.252664 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" Feb 27 07:27:51 crc kubenswrapper[4725]: E0227 07:27:51.253745 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.919254 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5jfbz"] Feb 27 07:27:57 crc kubenswrapper[4725]: E0227 07:27:57.920132 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="registry-server" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.920151 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="registry-server" Feb 27 07:27:57 crc kubenswrapper[4725]: E0227 07:27:57.920184 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" 
containerName="extract-utilities" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.920193 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="extract-utilities" Feb 27 07:27:57 crc kubenswrapper[4725]: E0227 07:27:57.920219 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="extract-content" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.920225 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="extract-content" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.920465 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="438c69f1-41f6-4739-9b5d-62e7a1e91449" containerName="registry-server" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.921972 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.941591 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5jfbz"] Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.955014 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp44t\" (UniqueName: \"kubernetes.io/projected/978a8e66-38ed-419c-894e-4ee896c0e773-kube-api-access-bp44t\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.955183 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-utilities\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " 
pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:57 crc kubenswrapper[4725]: I0227 07:27:57.955657 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-catalog-content\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.058708 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp44t\" (UniqueName: \"kubernetes.io/projected/978a8e66-38ed-419c-894e-4ee896c0e773-kube-api-access-bp44t\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.058852 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-utilities\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.059042 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-catalog-content\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.059434 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-utilities\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " 
pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.059491 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-catalog-content\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.089439 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp44t\" (UniqueName: \"kubernetes.io/projected/978a8e66-38ed-419c-894e-4ee896c0e773-kube-api-access-bp44t\") pod \"redhat-operators-5jfbz\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.253110 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:27:58 crc kubenswrapper[4725]: I0227 07:27:58.757383 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5jfbz"] Feb 27 07:27:59 crc kubenswrapper[4725]: I0227 07:27:59.561667 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerStarted","Data":"cca616aa18ea6b4d3d3bd9a82be49429fd4a7a50ad479a9628cc2077c0b10667"} Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.161254 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536288-6txh6"] Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.163976 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.169240 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.170741 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.171811 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536288-6txh6"] Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.175724 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.213223 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2pp\" (UniqueName: \"kubernetes.io/projected/ff77c1ac-c67c-41d0-aa66-202ae0c908c7-kube-api-access-pt2pp\") pod \"auto-csr-approver-29536288-6txh6\" (UID: \"ff77c1ac-c67c-41d0-aa66-202ae0c908c7\") " pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.314804 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2pp\" (UniqueName: \"kubernetes.io/projected/ff77c1ac-c67c-41d0-aa66-202ae0c908c7-kube-api-access-pt2pp\") pod \"auto-csr-approver-29536288-6txh6\" (UID: \"ff77c1ac-c67c-41d0-aa66-202ae0c908c7\") " pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.337737 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2pp\" (UniqueName: \"kubernetes.io/projected/ff77c1ac-c67c-41d0-aa66-202ae0c908c7-kube-api-access-pt2pp\") pod \"auto-csr-approver-29536288-6txh6\" (UID: \"ff77c1ac-c67c-41d0-aa66-202ae0c908c7\") " 
pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.483662 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.575869 4725 generic.go:334] "Generic (PLEG): container finished" podID="978a8e66-38ed-419c-894e-4ee896c0e773" containerID="8a20b5e968405228d3f34f2cc6d4a27f8cc6fd2dee299fb7c3abe92dbf2fb0bb" exitCode=0 Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.576180 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerDied","Data":"8a20b5e968405228d3f34f2cc6d4a27f8cc6fd2dee299fb7c3abe92dbf2fb0bb"} Feb 27 07:28:00 crc kubenswrapper[4725]: I0227 07:28:00.964817 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536288-6txh6"] Feb 27 07:28:01 crc kubenswrapper[4725]: I0227 07:28:01.587031 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536288-6txh6" event={"ID":"ff77c1ac-c67c-41d0-aa66-202ae0c908c7","Type":"ContainerStarted","Data":"575c9507f509ae6cb67eb12ba40614f2e9e3a874d2ba079d9b58890db63b41ed"} Feb 27 07:28:02 crc kubenswrapper[4725]: I0227 07:28:02.600397 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerStarted","Data":"9a461c4c095d2158ffd0fd534d7f41dd129c602c957606d7e2976aa9c996a7f7"} Feb 27 07:28:02 crc kubenswrapper[4725]: I0227 07:28:02.604486 4725 generic.go:334] "Generic (PLEG): container finished" podID="ff77c1ac-c67c-41d0-aa66-202ae0c908c7" containerID="812befb8061ab8ded8cef1d0771ae2c22d3cea3cd0ac56a3b216a77815b01089" exitCode=0 Feb 27 07:28:02 crc kubenswrapper[4725]: I0227 07:28:02.604542 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536288-6txh6" event={"ID":"ff77c1ac-c67c-41d0-aa66-202ae0c908c7","Type":"ContainerDied","Data":"812befb8061ab8ded8cef1d0771ae2c22d3cea3cd0ac56a3b216a77815b01089"} Feb 27 07:28:03 crc kubenswrapper[4725]: I0227 07:28:03.980167 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.099902 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt2pp\" (UniqueName: \"kubernetes.io/projected/ff77c1ac-c67c-41d0-aa66-202ae0c908c7-kube-api-access-pt2pp\") pod \"ff77c1ac-c67c-41d0-aa66-202ae0c908c7\" (UID: \"ff77c1ac-c67c-41d0-aa66-202ae0c908c7\") " Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.118249 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff77c1ac-c67c-41d0-aa66-202ae0c908c7-kube-api-access-pt2pp" (OuterVolumeSpecName: "kube-api-access-pt2pp") pod "ff77c1ac-c67c-41d0-aa66-202ae0c908c7" (UID: "ff77c1ac-c67c-41d0-aa66-202ae0c908c7"). InnerVolumeSpecName "kube-api-access-pt2pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.202327 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt2pp\" (UniqueName: \"kubernetes.io/projected/ff77c1ac-c67c-41d0-aa66-202ae0c908c7-kube-api-access-pt2pp\") on node \"crc\" DevicePath \"\"" Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.252071 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.624856 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536288-6txh6" Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.624983 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536288-6txh6" event={"ID":"ff77c1ac-c67c-41d0-aa66-202ae0c908c7","Type":"ContainerDied","Data":"575c9507f509ae6cb67eb12ba40614f2e9e3a874d2ba079d9b58890db63b41ed"} Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.625410 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="575c9507f509ae6cb67eb12ba40614f2e9e3a874d2ba079d9b58890db63b41ed" Feb 27 07:28:04 crc kubenswrapper[4725]: I0227 07:28:04.628910 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"69ea844c312754b29501509e300248e6ff1abc52218ff8ee949cb2b87e93a292"} Feb 27 07:28:05 crc kubenswrapper[4725]: I0227 07:28:05.058143 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536282-8x8ns"] Feb 27 07:28:05 crc kubenswrapper[4725]: I0227 07:28:05.070621 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536282-8x8ns"] Feb 27 07:28:06 crc kubenswrapper[4725]: I0227 07:28:06.269157 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9e4130-3836-4f21-a186-ce361727c44a" path="/var/lib/kubelet/pods/ce9e4130-3836-4f21-a186-ce361727c44a/volumes" Feb 27 07:28:06 crc kubenswrapper[4725]: I0227 07:28:06.650042 4725 generic.go:334] "Generic (PLEG): container finished" podID="978a8e66-38ed-419c-894e-4ee896c0e773" containerID="9a461c4c095d2158ffd0fd534d7f41dd129c602c957606d7e2976aa9c996a7f7" exitCode=0 Feb 27 07:28:06 crc kubenswrapper[4725]: I0227 07:28:06.650140 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" 
event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerDied","Data":"9a461c4c095d2158ffd0fd534d7f41dd129c602c957606d7e2976aa9c996a7f7"} Feb 27 07:28:07 crc kubenswrapper[4725]: I0227 07:28:07.497539 4725 scope.go:117] "RemoveContainer" containerID="01c2a059ccc79fc6d086ed88bf49e382019a15a3d57b9f3e929ea6dc98be162e" Feb 27 07:28:07 crc kubenswrapper[4725]: I0227 07:28:07.662301 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerStarted","Data":"ded9a1ddca2fbad208e466c5f968fb29592921048cd1ed95d8eed7d16d60fcd4"} Feb 27 07:28:07 crc kubenswrapper[4725]: I0227 07:28:07.683576 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5jfbz" podStartSLOduration=4.209495713 podStartE2EDuration="10.6835587s" podCreationTimestamp="2026-02-27 07:27:57 +0000 UTC" firstStartedPulling="2026-02-27 07:28:00.57779012 +0000 UTC m=+4659.040410679" lastFinishedPulling="2026-02-27 07:28:07.051853097 +0000 UTC m=+4665.514473666" observedRunningTime="2026-02-27 07:28:07.683016065 +0000 UTC m=+4666.145636664" watchObservedRunningTime="2026-02-27 07:28:07.6835587 +0000 UTC m=+4666.146179269" Feb 27 07:28:08 crc kubenswrapper[4725]: I0227 07:28:08.268442 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:28:08 crc kubenswrapper[4725]: I0227 07:28:08.268526 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:28:09 crc kubenswrapper[4725]: I0227 07:28:09.313832 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5jfbz" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" probeResult="failure" output=< Feb 27 07:28:09 crc kubenswrapper[4725]: timeout: failed to connect 
service ":50051" within 1s Feb 27 07:28:09 crc kubenswrapper[4725]: > Feb 27 07:28:19 crc kubenswrapper[4725]: I0227 07:28:19.320955 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5jfbz" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" probeResult="failure" output=< Feb 27 07:28:19 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:28:19 crc kubenswrapper[4725]: > Feb 27 07:28:29 crc kubenswrapper[4725]: I0227 07:28:29.313549 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5jfbz" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" probeResult="failure" output=< Feb 27 07:28:29 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:28:29 crc kubenswrapper[4725]: > Feb 27 07:28:38 crc kubenswrapper[4725]: I0227 07:28:38.324975 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:28:38 crc kubenswrapper[4725]: I0227 07:28:38.383358 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:28:38 crc kubenswrapper[4725]: I0227 07:28:38.569415 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5jfbz"] Feb 27 07:28:39 crc kubenswrapper[4725]: I0227 07:28:39.981054 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5jfbz" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" containerID="cri-o://ded9a1ddca2fbad208e466c5f968fb29592921048cd1ed95d8eed7d16d60fcd4" gracePeriod=2 Feb 27 07:28:40 crc kubenswrapper[4725]: I0227 07:28:40.994274 4725 generic.go:334] "Generic (PLEG): container finished" podID="978a8e66-38ed-419c-894e-4ee896c0e773" 
containerID="ded9a1ddca2fbad208e466c5f968fb29592921048cd1ed95d8eed7d16d60fcd4" exitCode=0 Feb 27 07:28:40 crc kubenswrapper[4725]: I0227 07:28:40.994350 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerDied","Data":"ded9a1ddca2fbad208e466c5f968fb29592921048cd1ed95d8eed7d16d60fcd4"} Feb 27 07:28:40 crc kubenswrapper[4725]: I0227 07:28:40.994492 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5jfbz" event={"ID":"978a8e66-38ed-419c-894e-4ee896c0e773","Type":"ContainerDied","Data":"cca616aa18ea6b4d3d3bd9a82be49429fd4a7a50ad479a9628cc2077c0b10667"} Feb 27 07:28:40 crc kubenswrapper[4725]: I0227 07:28:40.994512 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca616aa18ea6b4d3d3bd9a82be49429fd4a7a50ad479a9628cc2077c0b10667" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.071258 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.240209 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp44t\" (UniqueName: \"kubernetes.io/projected/978a8e66-38ed-419c-894e-4ee896c0e773-kube-api-access-bp44t\") pod \"978a8e66-38ed-419c-894e-4ee896c0e773\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.240661 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-utilities\") pod \"978a8e66-38ed-419c-894e-4ee896c0e773\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.240745 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-catalog-content\") pod \"978a8e66-38ed-419c-894e-4ee896c0e773\" (UID: \"978a8e66-38ed-419c-894e-4ee896c0e773\") " Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.241370 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-utilities" (OuterVolumeSpecName: "utilities") pod "978a8e66-38ed-419c-894e-4ee896c0e773" (UID: "978a8e66-38ed-419c-894e-4ee896c0e773"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.241778 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.268541 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978a8e66-38ed-419c-894e-4ee896c0e773-kube-api-access-bp44t" (OuterVolumeSpecName: "kube-api-access-bp44t") pod "978a8e66-38ed-419c-894e-4ee896c0e773" (UID: "978a8e66-38ed-419c-894e-4ee896c0e773"). InnerVolumeSpecName "kube-api-access-bp44t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.343620 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp44t\" (UniqueName: \"kubernetes.io/projected/978a8e66-38ed-419c-894e-4ee896c0e773-kube-api-access-bp44t\") on node \"crc\" DevicePath \"\"" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.357624 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "978a8e66-38ed-419c-894e-4ee896c0e773" (UID: "978a8e66-38ed-419c-894e-4ee896c0e773"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:28:41 crc kubenswrapper[4725]: I0227 07:28:41.445716 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/978a8e66-38ed-419c-894e-4ee896c0e773-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:28:42 crc kubenswrapper[4725]: I0227 07:28:42.003815 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5jfbz" Feb 27 07:28:42 crc kubenswrapper[4725]: I0227 07:28:42.041792 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5jfbz"] Feb 27 07:28:42 crc kubenswrapper[4725]: I0227 07:28:42.050507 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5jfbz"] Feb 27 07:28:42 crc kubenswrapper[4725]: I0227 07:28:42.273757 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" path="/var/lib/kubelet/pods/978a8e66-38ed-419c-894e-4ee896c0e773/volumes" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.870702 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2fxnj"] Feb 27 07:29:56 crc kubenswrapper[4725]: E0227 07:29:56.872577 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="extract-utilities" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.872613 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="extract-utilities" Feb 27 07:29:56 crc kubenswrapper[4725]: E0227 07:29:56.872689 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.872712 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" Feb 27 07:29:56 crc kubenswrapper[4725]: E0227 07:29:56.872767 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="extract-content" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.872787 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" 
containerName="extract-content" Feb 27 07:29:56 crc kubenswrapper[4725]: E0227 07:29:56.872851 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff77c1ac-c67c-41d0-aa66-202ae0c908c7" containerName="oc" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.872871 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff77c1ac-c67c-41d0-aa66-202ae0c908c7" containerName="oc" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.873476 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff77c1ac-c67c-41d0-aa66-202ae0c908c7" containerName="oc" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.873558 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="978a8e66-38ed-419c-894e-4ee896c0e773" containerName="registry-server" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.877403 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:56 crc kubenswrapper[4725]: I0227 07:29:56.887015 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fxnj"] Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.022754 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-utilities\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.022817 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-catalog-content\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 
crc kubenswrapper[4725]: I0227 07:29:57.022882 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgzcd\" (UniqueName: \"kubernetes.io/projected/cf3ce3de-38bf-497f-8f71-cd955be4271a-kube-api-access-sgzcd\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.124397 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-utilities\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.124470 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-catalog-content\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.124560 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgzcd\" (UniqueName: \"kubernetes.io/projected/cf3ce3de-38bf-497f-8f71-cd955be4271a-kube-api-access-sgzcd\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.124864 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-utilities\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc 
kubenswrapper[4725]: I0227 07:29:57.124918 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-catalog-content\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.146116 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgzcd\" (UniqueName: \"kubernetes.io/projected/cf3ce3de-38bf-497f-8f71-cd955be4271a-kube-api-access-sgzcd\") pod \"community-operators-2fxnj\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.224043 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:29:57 crc kubenswrapper[4725]: I0227 07:29:57.793207 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fxnj"] Feb 27 07:29:58 crc kubenswrapper[4725]: I0227 07:29:58.798867 4725 generic.go:334] "Generic (PLEG): container finished" podID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerID="0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87" exitCode=0 Feb 27 07:29:58 crc kubenswrapper[4725]: I0227 07:29:58.798980 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerDied","Data":"0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87"} Feb 27 07:29:58 crc kubenswrapper[4725]: I0227 07:29:58.799152 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" 
event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerStarted","Data":"fb451dcad2710131176e1b87f737873b808e667964a36c1da3be3370e4b60953"} Feb 27 07:29:58 crc kubenswrapper[4725]: I0227 07:29:58.802452 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:29:59 crc kubenswrapper[4725]: I0227 07:29:59.810478 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerStarted","Data":"9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411"} Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.149970 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536290-bd2z6"] Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.151684 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.153843 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.155517 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.155709 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.163219 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536290-bd2z6"] Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.282399 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk"] Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.283497 4725 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk"] Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.283575 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.285818 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.287068 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.325603 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg92\" (UniqueName: \"kubernetes.io/projected/2c3b27e1-7a3d-45de-8c43-acce6899aa85-kube-api-access-cwg92\") pod \"auto-csr-approver-29536290-bd2z6\" (UID: \"2c3b27e1-7a3d-45de-8c43-acce6899aa85\") " pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.427526 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg92\" (UniqueName: \"kubernetes.io/projected/2c3b27e1-7a3d-45de-8c43-acce6899aa85-kube-api-access-cwg92\") pod \"auto-csr-approver-29536290-bd2z6\" (UID: \"2c3b27e1-7a3d-45de-8c43-acce6899aa85\") " pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.427623 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-secret-volume\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.427700 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-config-volume\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.427737 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-kube-api-access-79c4h\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.455015 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwg92\" (UniqueName: \"kubernetes.io/projected/2c3b27e1-7a3d-45de-8c43-acce6899aa85-kube-api-access-cwg92\") pod \"auto-csr-approver-29536290-bd2z6\" (UID: \"2c3b27e1-7a3d-45de-8c43-acce6899aa85\") " pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.471193 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.530684 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-secret-volume\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.530783 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-config-volume\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.530829 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-kube-api-access-79c4h\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.535805 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-config-volume\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.536142 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-secret-volume\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.561965 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-kube-api-access-79c4h\") pod \"collect-profiles-29536290-9z2vk\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:00 crc kubenswrapper[4725]: I0227 07:30:00.605598 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:01 crc kubenswrapper[4725]: I0227 07:30:01.599866 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536290-bd2z6"] Feb 27 07:30:01 crc kubenswrapper[4725]: W0227 07:30:01.612523 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c3b27e1_7a3d_45de_8c43_acce6899aa85.slice/crio-4150f7d5b3d79977ea63ac7f3a91f7f6a86c6a29a05cddf4cb96f63a9162fe6e WatchSource:0}: Error finding container 4150f7d5b3d79977ea63ac7f3a91f7f6a86c6a29a05cddf4cb96f63a9162fe6e: Status 404 returned error can't find the container with id 4150f7d5b3d79977ea63ac7f3a91f7f6a86c6a29a05cddf4cb96f63a9162fe6e Feb 27 07:30:01 crc kubenswrapper[4725]: I0227 07:30:01.694178 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk"] Feb 27 07:30:02 crc kubenswrapper[4725]: I0227 07:30:02.237302 4725 generic.go:334] "Generic (PLEG): container finished" podID="7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" 
containerID="f7ec065423bb2ece67c8ab23514f17cb7b644dfdb8707ab706e2f0c7abc4cc8f" exitCode=0 Feb 27 07:30:02 crc kubenswrapper[4725]: I0227 07:30:02.237592 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" event={"ID":"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10","Type":"ContainerDied","Data":"f7ec065423bb2ece67c8ab23514f17cb7b644dfdb8707ab706e2f0c7abc4cc8f"} Feb 27 07:30:02 crc kubenswrapper[4725]: I0227 07:30:02.237619 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" event={"ID":"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10","Type":"ContainerStarted","Data":"01abdc77920ecaaffa30680db88cc31094a7f80adbfc80c037a6c89e3a5d004e"} Feb 27 07:30:02 crc kubenswrapper[4725]: I0227 07:30:02.240074 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" event={"ID":"2c3b27e1-7a3d-45de-8c43-acce6899aa85","Type":"ContainerStarted","Data":"4150f7d5b3d79977ea63ac7f3a91f7f6a86c6a29a05cddf4cb96f63a9162fe6e"} Feb 27 07:30:02 crc kubenswrapper[4725]: I0227 07:30:02.243342 4725 generic.go:334] "Generic (PLEG): container finished" podID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerID="9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411" exitCode=0 Feb 27 07:30:02 crc kubenswrapper[4725]: I0227 07:30:02.243384 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerDied","Data":"9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411"} Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.633858 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.802408 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-kube-api-access-79c4h\") pod \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.802531 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-secret-volume\") pod \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.802652 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-config-volume\") pod \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\" (UID: \"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10\") " Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.804255 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-config-volume" (OuterVolumeSpecName: "config-volume") pod "7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" (UID: "7187c015-1b5e-4d3d-a7b1-0c6c59a7db10"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.808473 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-kube-api-access-79c4h" (OuterVolumeSpecName: "kube-api-access-79c4h") pod "7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" (UID: "7187c015-1b5e-4d3d-a7b1-0c6c59a7db10"). 
InnerVolumeSpecName "kube-api-access-79c4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.815354 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" (UID: "7187c015-1b5e-4d3d-a7b1-0c6c59a7db10"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.904939 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.904977 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:03 crc kubenswrapper[4725]: I0227 07:30:03.904990 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79c4h\" (UniqueName: \"kubernetes.io/projected/7187c015-1b5e-4d3d-a7b1-0c6c59a7db10-kube-api-access-79c4h\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.238276 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8hxnl"] Feb 27 07:30:04 crc kubenswrapper[4725]: E0227 07:30:04.239041 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" containerName="collect-profiles" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.239055 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" containerName="collect-profiles" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.239300 4725 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7187c015-1b5e-4d3d-a7b1-0c6c59a7db10" containerName="collect-profiles" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.241503 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.272745 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hxnl"] Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.273526 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" event={"ID":"7187c015-1b5e-4d3d-a7b1-0c6c59a7db10","Type":"ContainerDied","Data":"01abdc77920ecaaffa30680db88cc31094a7f80adbfc80c037a6c89e3a5d004e"} Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.273552 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01abdc77920ecaaffa30680db88cc31094a7f80adbfc80c037a6c89e3a5d004e" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.273596 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536290-9z2vk" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.295851 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerStarted","Data":"216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e"} Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.318010 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-utilities\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.318159 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-catalog-content\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.318317 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6bw\" (UniqueName: \"kubernetes.io/projected/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-kube-api-access-qk6bw\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.345708 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2fxnj" podStartSLOduration=4.400631144 podStartE2EDuration="8.345681392s" podCreationTimestamp="2026-02-27 07:29:56 +0000 UTC" 
firstStartedPulling="2026-02-27 07:29:58.800995112 +0000 UTC m=+4777.263615681" lastFinishedPulling="2026-02-27 07:30:02.74604536 +0000 UTC m=+4781.208665929" observedRunningTime="2026-02-27 07:30:04.317148384 +0000 UTC m=+4782.779768963" watchObservedRunningTime="2026-02-27 07:30:04.345681392 +0000 UTC m=+4782.808301961" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.419155 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-utilities\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.419238 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-catalog-content\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.419339 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6bw\" (UniqueName: \"kubernetes.io/projected/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-kube-api-access-qk6bw\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.419969 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-utilities\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.420021 4725 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-catalog-content\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.448465 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6bw\" (UniqueName: \"kubernetes.io/projected/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-kube-api-access-qk6bw\") pod \"certified-operators-8hxnl\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.578956 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.777828 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz"] Feb 27 07:30:04 crc kubenswrapper[4725]: I0227 07:30:04.788416 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536245-b96nz"] Feb 27 07:30:05 crc kubenswrapper[4725]: I0227 07:30:05.314071 4725 generic.go:334] "Generic (PLEG): container finished" podID="2c3b27e1-7a3d-45de-8c43-acce6899aa85" containerID="287765fe15232e396b3a39f723943c4dad0b4091f71242e2b286c28a4298e12c" exitCode=0 Feb 27 07:30:05 crc kubenswrapper[4725]: I0227 07:30:05.314403 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" event={"ID":"2c3b27e1-7a3d-45de-8c43-acce6899aa85","Type":"ContainerDied","Data":"287765fe15232e396b3a39f723943c4dad0b4091f71242e2b286c28a4298e12c"} Feb 27 07:30:05 crc kubenswrapper[4725]: I0227 07:30:05.622894 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-8hxnl"] Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.272220 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a61a62-28d7-42be-b58b-1d98821caefb" path="/var/lib/kubelet/pods/a3a61a62-28d7-42be-b58b-1d98821caefb/volumes" Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.332759 4725 generic.go:334] "Generic (PLEG): container finished" podID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerID="ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8" exitCode=0 Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.332876 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hxnl" event={"ID":"6831c54e-3bee-4dd8-8b52-0260ec3a9b86","Type":"ContainerDied","Data":"ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8"} Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.333228 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hxnl" event={"ID":"6831c54e-3bee-4dd8-8b52-0260ec3a9b86","Type":"ContainerStarted","Data":"d8098dff825aaf902d22fca860fff39c58899e28db08cabeb5a2138561ee8974"} Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.698403 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.803071 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwg92\" (UniqueName: \"kubernetes.io/projected/2c3b27e1-7a3d-45de-8c43-acce6899aa85-kube-api-access-cwg92\") pod \"2c3b27e1-7a3d-45de-8c43-acce6899aa85\" (UID: \"2c3b27e1-7a3d-45de-8c43-acce6899aa85\") " Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.813899 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3b27e1-7a3d-45de-8c43-acce6899aa85-kube-api-access-cwg92" (OuterVolumeSpecName: "kube-api-access-cwg92") pod "2c3b27e1-7a3d-45de-8c43-acce6899aa85" (UID: "2c3b27e1-7a3d-45de-8c43-acce6899aa85"). InnerVolumeSpecName "kube-api-access-cwg92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:30:06 crc kubenswrapper[4725]: I0227 07:30:06.905557 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwg92\" (UniqueName: \"kubernetes.io/projected/2c3b27e1-7a3d-45de-8c43-acce6899aa85-kube-api-access-cwg92\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.224873 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.224950 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.287056 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.346374 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.348263 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536290-bd2z6" event={"ID":"2c3b27e1-7a3d-45de-8c43-acce6899aa85","Type":"ContainerDied","Data":"4150f7d5b3d79977ea63ac7f3a91f7f6a86c6a29a05cddf4cb96f63a9162fe6e"} Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.348324 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4150f7d5b3d79977ea63ac7f3a91f7f6a86c6a29a05cddf4cb96f63a9162fe6e" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.619115 4725 scope.go:117] "RemoveContainer" containerID="f5dbf9bd4c75f25e78483cf73a97a75a735db58753202f06a2085e5990b85ba0" Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.768993 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536284-fkszf"] Feb 27 07:30:07 crc kubenswrapper[4725]: I0227 07:30:07.781533 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536284-fkszf"] Feb 27 07:30:08 crc kubenswrapper[4725]: I0227 07:30:08.275059 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac7f261-80f3-4075-9f99-069ba18a12ff" path="/var/lib/kubelet/pods/4ac7f261-80f3-4075-9f99-069ba18a12ff/volumes" Feb 27 07:30:08 crc kubenswrapper[4725]: I0227 07:30:08.361247 4725 generic.go:334] "Generic (PLEG): container finished" podID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerID="ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632" exitCode=0 Feb 27 07:30:08 crc kubenswrapper[4725]: I0227 07:30:08.361377 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hxnl" event={"ID":"6831c54e-3bee-4dd8-8b52-0260ec3a9b86","Type":"ContainerDied","Data":"ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632"} Feb 27 07:30:08 crc 
kubenswrapper[4725]: E0227 07:30:08.406581 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6831c54e_3bee_4dd8_8b52_0260ec3a9b86.slice/crio-ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632.scope\": RecentStats: unable to find data in memory cache]" Feb 27 07:30:09 crc kubenswrapper[4725]: I0227 07:30:09.373687 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hxnl" event={"ID":"6831c54e-3bee-4dd8-8b52-0260ec3a9b86","Type":"ContainerStarted","Data":"0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f"} Feb 27 07:30:09 crc kubenswrapper[4725]: I0227 07:30:09.398508 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8hxnl" podStartSLOduration=2.903860124 podStartE2EDuration="5.398483248s" podCreationTimestamp="2026-02-27 07:30:04 +0000 UTC" firstStartedPulling="2026-02-27 07:30:06.338230003 +0000 UTC m=+4784.800850602" lastFinishedPulling="2026-02-27 07:30:08.832853147 +0000 UTC m=+4787.295473726" observedRunningTime="2026-02-27 07:30:09.390141562 +0000 UTC m=+4787.852762161" watchObservedRunningTime="2026-02-27 07:30:09.398483248 +0000 UTC m=+4787.861103837" Feb 27 07:30:14 crc kubenswrapper[4725]: I0227 07:30:14.579500 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:14 crc kubenswrapper[4725]: I0227 07:30:14.580074 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:15 crc kubenswrapper[4725]: I0227 07:30:15.131892 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:15 crc kubenswrapper[4725]: I0227 07:30:15.485571 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:15 crc kubenswrapper[4725]: I0227 07:30:15.533117 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hxnl"] Feb 27 07:30:17 crc kubenswrapper[4725]: I0227 07:30:17.420718 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:30:17 crc kubenswrapper[4725]: I0227 07:30:17.453704 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8hxnl" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="registry-server" containerID="cri-o://0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f" gracePeriod=2 Feb 27 07:30:17 crc kubenswrapper[4725]: I0227 07:30:17.771359 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fxnj"] Feb 27 07:30:17 crc kubenswrapper[4725]: I0227 07:30:17.771845 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2fxnj" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="registry-server" containerID="cri-o://216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e" gracePeriod=2 Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.071030 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.218334 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.261072 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-catalog-content\") pod \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.261504 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk6bw\" (UniqueName: \"kubernetes.io/projected/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-kube-api-access-qk6bw\") pod \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.261535 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-utilities\") pod \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\" (UID: \"6831c54e-3bee-4dd8-8b52-0260ec3a9b86\") " Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.262484 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-utilities" (OuterVolumeSpecName: "utilities") pod "6831c54e-3bee-4dd8-8b52-0260ec3a9b86" (UID: "6831c54e-3bee-4dd8-8b52-0260ec3a9b86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.268820 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-kube-api-access-qk6bw" (OuterVolumeSpecName: "kube-api-access-qk6bw") pod "6831c54e-3bee-4dd8-8b52-0260ec3a9b86" (UID: "6831c54e-3bee-4dd8-8b52-0260ec3a9b86"). InnerVolumeSpecName "kube-api-access-qk6bw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.320130 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6831c54e-3bee-4dd8-8b52-0260ec3a9b86" (UID: "6831c54e-3bee-4dd8-8b52-0260ec3a9b86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.363531 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-catalog-content\") pod \"cf3ce3de-38bf-497f-8f71-cd955be4271a\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.363618 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgzcd\" (UniqueName: \"kubernetes.io/projected/cf3ce3de-38bf-497f-8f71-cd955be4271a-kube-api-access-sgzcd\") pod \"cf3ce3de-38bf-497f-8f71-cd955be4271a\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.363786 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-utilities\") pod \"cf3ce3de-38bf-497f-8f71-cd955be4271a\" (UID: \"cf3ce3de-38bf-497f-8f71-cd955be4271a\") " Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.364214 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk6bw\" (UniqueName: \"kubernetes.io/projected/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-kube-api-access-qk6bw\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.364230 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.364239 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6831c54e-3bee-4dd8-8b52-0260ec3a9b86-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.364579 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-utilities" (OuterVolumeSpecName: "utilities") pod "cf3ce3de-38bf-497f-8f71-cd955be4271a" (UID: "cf3ce3de-38bf-497f-8f71-cd955be4271a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.366880 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3ce3de-38bf-497f-8f71-cd955be4271a-kube-api-access-sgzcd" (OuterVolumeSpecName: "kube-api-access-sgzcd") pod "cf3ce3de-38bf-497f-8f71-cd955be4271a" (UID: "cf3ce3de-38bf-497f-8f71-cd955be4271a"). InnerVolumeSpecName "kube-api-access-sgzcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.419011 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf3ce3de-38bf-497f-8f71-cd955be4271a" (UID: "cf3ce3de-38bf-497f-8f71-cd955be4271a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.466242 4725 generic.go:334] "Generic (PLEG): container finished" podID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerID="0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f" exitCode=0 Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.466325 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hxnl" event={"ID":"6831c54e-3bee-4dd8-8b52-0260ec3a9b86","Type":"ContainerDied","Data":"0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f"} Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.466352 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hxnl" event={"ID":"6831c54e-3bee-4dd8-8b52-0260ec3a9b86","Type":"ContainerDied","Data":"d8098dff825aaf902d22fca860fff39c58899e28db08cabeb5a2138561ee8974"} Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.466368 4725 scope.go:117] "RemoveContainer" containerID="0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.466488 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hxnl" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.469008 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.469144 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgzcd\" (UniqueName: \"kubernetes.io/projected/cf3ce3de-38bf-497f-8f71-cd955be4271a-kube-api-access-sgzcd\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.469160 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3ce3de-38bf-497f-8f71-cd955be4271a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.471004 4725 generic.go:334] "Generic (PLEG): container finished" podID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerID="216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e" exitCode=0 Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.471081 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerDied","Data":"216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e"} Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.471124 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fxnj" event={"ID":"cf3ce3de-38bf-497f-8f71-cd955be4271a","Type":"ContainerDied","Data":"fb451dcad2710131176e1b87f737873b808e667964a36c1da3be3370e4b60953"} Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.471220 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fxnj" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.509318 4725 scope.go:117] "RemoveContainer" containerID="ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.522687 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hxnl"] Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.532022 4725 scope.go:117] "RemoveContainer" containerID="ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.535721 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8hxnl"] Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.544231 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fxnj"] Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.552257 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2fxnj"] Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.553604 4725 scope.go:117] "RemoveContainer" containerID="0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.553995 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f\": container with ID starting with 0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f not found: ID does not exist" containerID="0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.554030 4725 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f"} err="failed to get container status \"0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f\": rpc error: code = NotFound desc = could not find container \"0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f\": container with ID starting with 0f35626538ae1f98f224ed2d5425abc67c441e0bd95f7ffb0096ce27cb93db9f not found: ID does not exist" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.554058 4725 scope.go:117] "RemoveContainer" containerID="ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.554394 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632\": container with ID starting with ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632 not found: ID does not exist" containerID="ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.554440 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632"} err="failed to get container status \"ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632\": rpc error: code = NotFound desc = could not find container \"ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632\": container with ID starting with ab51ae32af5f002a5cab9e81cf1dc22d4aa69f6353d2dd1cf0dfe8f3f7964632 not found: ID does not exist" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.554471 4725 scope.go:117] "RemoveContainer" containerID="ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.554907 4725 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8\": container with ID starting with ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8 not found: ID does not exist" containerID="ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.554937 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8"} err="failed to get container status \"ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8\": rpc error: code = NotFound desc = could not find container \"ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8\": container with ID starting with ed76261c9cca3c2227d4e069978385ef615356e0f69d99f9d31beaa880b8ddf8 not found: ID does not exist" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.554956 4725 scope.go:117] "RemoveContainer" containerID="216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.589871 4725 scope.go:117] "RemoveContainer" containerID="9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.645652 4725 scope.go:117] "RemoveContainer" containerID="0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.704333 4725 scope.go:117] "RemoveContainer" containerID="216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.705704 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e\": container with ID starting with 
216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e not found: ID does not exist" containerID="216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.705747 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e"} err="failed to get container status \"216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e\": rpc error: code = NotFound desc = could not find container \"216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e\": container with ID starting with 216d27fe4bd102eb6b3917aaf949d23754444102d318a0f8ac378c1a1604dc6e not found: ID does not exist" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.705778 4725 scope.go:117] "RemoveContainer" containerID="9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.706087 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411\": container with ID starting with 9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411 not found: ID does not exist" containerID="9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.706130 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411"} err="failed to get container status \"9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411\": rpc error: code = NotFound desc = could not find container \"9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411\": container with ID starting with 9155834546ee6640e29980c4c3377d8c3f5031ca9eeadbc32b3f7edf88a85411 not found: ID does not 
exist" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.706156 4725 scope.go:117] "RemoveContainer" containerID="0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.706448 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87\": container with ID starting with 0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87 not found: ID does not exist" containerID="0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87" Feb 27 07:30:18 crc kubenswrapper[4725]: I0227 07:30:18.706489 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87"} err="failed to get container status \"0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87\": rpc error: code = NotFound desc = could not find container \"0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87\": container with ID starting with 0cf85a5f986769b480ef9890bcb14fd3edecde2256c1fe125c0b378ba9037f87 not found: ID does not exist" Feb 27 07:30:18 crc kubenswrapper[4725]: E0227 07:30:18.764696 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3ce3de_38bf_497f_8f71_cd955be4271a.slice/crio-fb451dcad2710131176e1b87f737873b808e667964a36c1da3be3370e4b60953\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3ce3de_38bf_497f_8f71_cd955be4271a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6831c54e_3bee_4dd8_8b52_0260ec3a9b86.slice/crio-d8098dff825aaf902d22fca860fff39c58899e28db08cabeb5a2138561ee8974\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6831c54e_3bee_4dd8_8b52_0260ec3a9b86.slice\": RecentStats: unable to find data in memory cache]" Feb 27 07:30:20 crc kubenswrapper[4725]: I0227 07:30:20.269764 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" path="/var/lib/kubelet/pods/6831c54e-3bee-4dd8-8b52-0260ec3a9b86/volumes" Feb 27 07:30:20 crc kubenswrapper[4725]: I0227 07:30:20.270917 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" path="/var/lib/kubelet/pods/cf3ce3de-38bf-497f-8f71-cd955be4271a/volumes" Feb 27 07:30:32 crc kubenswrapper[4725]: I0227 07:30:32.554750 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:30:32 crc kubenswrapper[4725]: I0227 07:30:32.555273 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:31:02 crc kubenswrapper[4725]: I0227 07:31:02.553957 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:31:02 crc kubenswrapper[4725]: I0227 07:31:02.554593 4725 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:31:07 crc kubenswrapper[4725]: I0227 07:31:07.690835 4725 scope.go:117] "RemoveContainer" containerID="9318b5132efeae727ae09e482eda2b315ab4e68cc5704f6de3f2c162198a76d6" Feb 27 07:31:32 crc kubenswrapper[4725]: I0227 07:31:32.639783 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:31:32 crc kubenswrapper[4725]: I0227 07:31:32.640546 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:31:32 crc kubenswrapper[4725]: I0227 07:31:32.640614 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:31:32 crc kubenswrapper[4725]: I0227 07:31:32.641810 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69ea844c312754b29501509e300248e6ff1abc52218ff8ee949cb2b87e93a292"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:31:32 crc kubenswrapper[4725]: I0227 07:31:32.641916 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://69ea844c312754b29501509e300248e6ff1abc52218ff8ee949cb2b87e93a292" gracePeriod=600 Feb 27 07:31:33 crc kubenswrapper[4725]: I0227 07:31:33.260232 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="69ea844c312754b29501509e300248e6ff1abc52218ff8ee949cb2b87e93a292" exitCode=0 Feb 27 07:31:33 crc kubenswrapper[4725]: I0227 07:31:33.260337 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"69ea844c312754b29501509e300248e6ff1abc52218ff8ee949cb2b87e93a292"} Feb 27 07:31:33 crc kubenswrapper[4725]: I0227 07:31:33.260624 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c"} Feb 27 07:31:33 crc kubenswrapper[4725]: I0227 07:31:33.260648 4725 scope.go:117] "RemoveContainer" containerID="4b9ed2d04f08bebb632e70c57e469aa58185510a84e6d989931d03397c396c51" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.170860 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536292-bxchs"] Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172013 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="registry-server" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172029 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="registry-server" Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172038 4725 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="extract-utilities" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172045 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="extract-utilities" Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172072 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="extract-content" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172081 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="extract-content" Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172105 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="extract-content" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172112 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="extract-content" Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172120 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="registry-server" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172126 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="registry-server" Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172137 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3b27e1-7a3d-45de-8c43-acce6899aa85" containerName="oc" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172143 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3b27e1-7a3d-45de-8c43-acce6899aa85" containerName="oc" Feb 27 07:32:00 crc kubenswrapper[4725]: E0227 07:32:00.172167 4725 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="extract-utilities" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172175 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="extract-utilities" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172418 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3ce3de-38bf-497f-8f71-cd955be4271a" containerName="registry-server" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172447 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3b27e1-7a3d-45de-8c43-acce6899aa85" containerName="oc" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.172469 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6831c54e-3bee-4dd8-8b52-0260ec3a9b86" containerName="registry-server" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.173347 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.179351 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.179622 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.179765 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.189795 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536292-bxchs"] Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.267874 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjm2s\" (UniqueName: \"kubernetes.io/projected/dbd62ee4-4541-4661-85d8-98eb01b9f5a9-kube-api-access-fjm2s\") pod \"auto-csr-approver-29536292-bxchs\" (UID: \"dbd62ee4-4541-4661-85d8-98eb01b9f5a9\") " pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.370266 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjm2s\" (UniqueName: \"kubernetes.io/projected/dbd62ee4-4541-4661-85d8-98eb01b9f5a9-kube-api-access-fjm2s\") pod \"auto-csr-approver-29536292-bxchs\" (UID: \"dbd62ee4-4541-4661-85d8-98eb01b9f5a9\") " pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.399760 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjm2s\" (UniqueName: \"kubernetes.io/projected/dbd62ee4-4541-4661-85d8-98eb01b9f5a9-kube-api-access-fjm2s\") pod \"auto-csr-approver-29536292-bxchs\" (UID: \"dbd62ee4-4541-4661-85d8-98eb01b9f5a9\") " 
pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:00 crc kubenswrapper[4725]: I0227 07:32:00.497726 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:01 crc kubenswrapper[4725]: I0227 07:32:01.059067 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536292-bxchs"] Feb 27 07:32:01 crc kubenswrapper[4725]: I0227 07:32:01.573463 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536292-bxchs" event={"ID":"dbd62ee4-4541-4661-85d8-98eb01b9f5a9","Type":"ContainerStarted","Data":"87273d10e3a2348cb82afc12d59547917fb5daf2fd34667253a9817fc5aa4ab4"} Feb 27 07:32:02 crc kubenswrapper[4725]: I0227 07:32:02.585074 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536292-bxchs" event={"ID":"dbd62ee4-4541-4661-85d8-98eb01b9f5a9","Type":"ContainerStarted","Data":"0e9822a331a88a9892d41ec4602433ba2a579c82d4810fc5a291f96183c3d8a4"} Feb 27 07:32:02 crc kubenswrapper[4725]: I0227 07:32:02.603744 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536292-bxchs" podStartSLOduration=1.468667542 podStartE2EDuration="2.603719434s" podCreationTimestamp="2026-02-27 07:32:00 +0000 UTC" firstStartedPulling="2026-02-27 07:32:01.053006098 +0000 UTC m=+4899.515626677" lastFinishedPulling="2026-02-27 07:32:02.188058 +0000 UTC m=+4900.650678569" observedRunningTime="2026-02-27 07:32:02.596946503 +0000 UTC m=+4901.059567082" watchObservedRunningTime="2026-02-27 07:32:02.603719434 +0000 UTC m=+4901.066340003" Feb 27 07:32:03 crc kubenswrapper[4725]: I0227 07:32:03.597621 4725 generic.go:334] "Generic (PLEG): container finished" podID="dbd62ee4-4541-4661-85d8-98eb01b9f5a9" containerID="0e9822a331a88a9892d41ec4602433ba2a579c82d4810fc5a291f96183c3d8a4" exitCode=0 Feb 27 07:32:03 crc 
kubenswrapper[4725]: I0227 07:32:03.597720 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536292-bxchs" event={"ID":"dbd62ee4-4541-4661-85d8-98eb01b9f5a9","Type":"ContainerDied","Data":"0e9822a331a88a9892d41ec4602433ba2a579c82d4810fc5a291f96183c3d8a4"} Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.034018 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.074536 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjm2s\" (UniqueName: \"kubernetes.io/projected/dbd62ee4-4541-4661-85d8-98eb01b9f5a9-kube-api-access-fjm2s\") pod \"dbd62ee4-4541-4661-85d8-98eb01b9f5a9\" (UID: \"dbd62ee4-4541-4661-85d8-98eb01b9f5a9\") " Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.080900 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd62ee4-4541-4661-85d8-98eb01b9f5a9-kube-api-access-fjm2s" (OuterVolumeSpecName: "kube-api-access-fjm2s") pod "dbd62ee4-4541-4661-85d8-98eb01b9f5a9" (UID: "dbd62ee4-4541-4661-85d8-98eb01b9f5a9"). InnerVolumeSpecName "kube-api-access-fjm2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.177239 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjm2s\" (UniqueName: \"kubernetes.io/projected/dbd62ee4-4541-4661-85d8-98eb01b9f5a9-kube-api-access-fjm2s\") on node \"crc\" DevicePath \"\"" Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.370089 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536286-99n9t"] Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.384989 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536286-99n9t"] Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.619430 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536292-bxchs" event={"ID":"dbd62ee4-4541-4661-85d8-98eb01b9f5a9","Type":"ContainerDied","Data":"87273d10e3a2348cb82afc12d59547917fb5daf2fd34667253a9817fc5aa4ab4"} Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.619464 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87273d10e3a2348cb82afc12d59547917fb5daf2fd34667253a9817fc5aa4ab4" Feb 27 07:32:05 crc kubenswrapper[4725]: I0227 07:32:05.619473 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536292-bxchs" Feb 27 07:32:06 crc kubenswrapper[4725]: I0227 07:32:06.270678 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7fd961-7665-421b-8680-f8468e642650" path="/var/lib/kubelet/pods/8c7fd961-7665-421b-8680-f8468e642650/volumes" Feb 27 07:32:07 crc kubenswrapper[4725]: I0227 07:32:07.818986 4725 scope.go:117] "RemoveContainer" containerID="3d117b165f8da445983d5fd3862a89f2ecb9fd2df77869e8e74eeec127b2a6f5" Feb 27 07:33:32 crc kubenswrapper[4725]: I0227 07:33:32.555063 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:33:32 crc kubenswrapper[4725]: I0227 07:33:32.555987 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.153023 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536294-wbzfs"] Feb 27 07:34:00 crc kubenswrapper[4725]: E0227 07:34:00.154502 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd62ee4-4541-4661-85d8-98eb01b9f5a9" containerName="oc" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.154524 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd62ee4-4541-4661-85d8-98eb01b9f5a9" containerName="oc" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.154862 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd62ee4-4541-4661-85d8-98eb01b9f5a9" containerName="oc" Feb 27 
07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.155990 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.158795 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.158795 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.159978 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.169788 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536294-wbzfs"] Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.229460 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdgp\" (UniqueName: \"kubernetes.io/projected/9eb0d7de-b864-4371-9879-af65e1f95869-kube-api-access-2cdgp\") pod \"auto-csr-approver-29536294-wbzfs\" (UID: \"9eb0d7de-b864-4371-9879-af65e1f95869\") " pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.332541 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdgp\" (UniqueName: \"kubernetes.io/projected/9eb0d7de-b864-4371-9879-af65e1f95869-kube-api-access-2cdgp\") pod \"auto-csr-approver-29536294-wbzfs\" (UID: \"9eb0d7de-b864-4371-9879-af65e1f95869\") " pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.357152 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdgp\" (UniqueName: 
\"kubernetes.io/projected/9eb0d7de-b864-4371-9879-af65e1f95869-kube-api-access-2cdgp\") pod \"auto-csr-approver-29536294-wbzfs\" (UID: \"9eb0d7de-b864-4371-9879-af65e1f95869\") " pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.483280 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:00 crc kubenswrapper[4725]: I0227 07:34:00.955817 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536294-wbzfs"] Feb 27 07:34:01 crc kubenswrapper[4725]: I0227 07:34:01.849211 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" event={"ID":"9eb0d7de-b864-4371-9879-af65e1f95869","Type":"ContainerStarted","Data":"451af3837eafbac8545051c7061979c3d494131bfa0ecbfe74d8ee3b2d516560"} Feb 27 07:34:02 crc kubenswrapper[4725]: I0227 07:34:02.554265 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:34:02 crc kubenswrapper[4725]: I0227 07:34:02.554913 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:34:02 crc kubenswrapper[4725]: I0227 07:34:02.864937 4725 generic.go:334] "Generic (PLEG): container finished" podID="9eb0d7de-b864-4371-9879-af65e1f95869" containerID="a9e5139c2602c83ac94d6247b279eea1d00483f1e388c243fe98fa457f42f779" exitCode=0 Feb 27 07:34:02 crc kubenswrapper[4725]: I0227 07:34:02.866385 4725 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" event={"ID":"9eb0d7de-b864-4371-9879-af65e1f95869","Type":"ContainerDied","Data":"a9e5139c2602c83ac94d6247b279eea1d00483f1e388c243fe98fa457f42f779"} Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.271360 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.422183 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdgp\" (UniqueName: \"kubernetes.io/projected/9eb0d7de-b864-4371-9879-af65e1f95869-kube-api-access-2cdgp\") pod \"9eb0d7de-b864-4371-9879-af65e1f95869\" (UID: \"9eb0d7de-b864-4371-9879-af65e1f95869\") " Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.428911 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb0d7de-b864-4371-9879-af65e1f95869-kube-api-access-2cdgp" (OuterVolumeSpecName: "kube-api-access-2cdgp") pod "9eb0d7de-b864-4371-9879-af65e1f95869" (UID: "9eb0d7de-b864-4371-9879-af65e1f95869"). InnerVolumeSpecName "kube-api-access-2cdgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.525510 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdgp\" (UniqueName: \"kubernetes.io/projected/9eb0d7de-b864-4371-9879-af65e1f95869-kube-api-access-2cdgp\") on node \"crc\" DevicePath \"\"" Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.883682 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" event={"ID":"9eb0d7de-b864-4371-9879-af65e1f95869","Type":"ContainerDied","Data":"451af3837eafbac8545051c7061979c3d494131bfa0ecbfe74d8ee3b2d516560"} Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.883719 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="451af3837eafbac8545051c7061979c3d494131bfa0ecbfe74d8ee3b2d516560" Feb 27 07:34:04 crc kubenswrapper[4725]: I0227 07:34:04.883788 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536294-wbzfs" Feb 27 07:34:05 crc kubenswrapper[4725]: I0227 07:34:05.347732 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536288-6txh6"] Feb 27 07:34:05 crc kubenswrapper[4725]: I0227 07:34:05.359696 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536288-6txh6"] Feb 27 07:34:06 crc kubenswrapper[4725]: I0227 07:34:06.266931 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff77c1ac-c67c-41d0-aa66-202ae0c908c7" path="/var/lib/kubelet/pods/ff77c1ac-c67c-41d0-aa66-202ae0c908c7/volumes" Feb 27 07:34:08 crc kubenswrapper[4725]: I0227 07:34:08.169082 4725 scope.go:117] "RemoveContainer" containerID="9a461c4c095d2158ffd0fd534d7f41dd129c602c957606d7e2976aa9c996a7f7" Feb 27 07:34:08 crc kubenswrapper[4725]: I0227 07:34:08.197278 4725 scope.go:117] "RemoveContainer" 
containerID="812befb8061ab8ded8cef1d0771ae2c22d3cea3cd0ac56a3b216a77815b01089" Feb 27 07:34:08 crc kubenswrapper[4725]: I0227 07:34:08.308445 4725 scope.go:117] "RemoveContainer" containerID="ded9a1ddca2fbad208e466c5f968fb29592921048cd1ed95d8eed7d16d60fcd4" Feb 27 07:34:08 crc kubenswrapper[4725]: I0227 07:34:08.335583 4725 scope.go:117] "RemoveContainer" containerID="8a20b5e968405228d3f34f2cc6d4a27f8cc6fd2dee299fb7c3abe92dbf2fb0bb" Feb 27 07:34:32 crc kubenswrapper[4725]: I0227 07:34:32.554609 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:34:32 crc kubenswrapper[4725]: I0227 07:34:32.555377 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:34:32 crc kubenswrapper[4725]: I0227 07:34:32.555441 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:34:32 crc kubenswrapper[4725]: I0227 07:34:32.556708 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:34:32 crc kubenswrapper[4725]: I0227 07:34:32.556804 4725 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" gracePeriod=600 Feb 27 07:34:32 crc kubenswrapper[4725]: E0227 07:34:32.692643 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:34:33 crc kubenswrapper[4725]: I0227 07:34:33.189870 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" exitCode=0 Feb 27 07:34:33 crc kubenswrapper[4725]: I0227 07:34:33.189934 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c"} Feb 27 07:34:33 crc kubenswrapper[4725]: I0227 07:34:33.190003 4725 scope.go:117] "RemoveContainer" containerID="69ea844c312754b29501509e300248e6ff1abc52218ff8ee949cb2b87e93a292" Feb 27 07:34:33 crc kubenswrapper[4725]: I0227 07:34:33.193393 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:34:33 crc kubenswrapper[4725]: E0227 07:34:33.194364 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:34:46 crc kubenswrapper[4725]: I0227 07:34:46.252319 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:34:46 crc kubenswrapper[4725]: E0227 07:34:46.254516 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:35:02 crc kubenswrapper[4725]: I0227 07:35:02.270324 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:35:02 crc kubenswrapper[4725]: E0227 07:35:02.271811 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:35:14 crc kubenswrapper[4725]: I0227 07:35:14.252278 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:35:14 crc kubenswrapper[4725]: E0227 07:35:14.253181 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:35:28 crc kubenswrapper[4725]: I0227 07:35:28.251722 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:35:28 crc kubenswrapper[4725]: E0227 07:35:28.252688 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:35:41 crc kubenswrapper[4725]: I0227 07:35:41.253202 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:35:41 crc kubenswrapper[4725]: E0227 07:35:41.254501 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:35:52 crc kubenswrapper[4725]: I0227 07:35:52.260174 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:35:52 crc kubenswrapper[4725]: E0227 07:35:52.261108 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.142267 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536296-9wzr4"] Feb 27 07:36:00 crc kubenswrapper[4725]: E0227 07:36:00.143386 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb0d7de-b864-4371-9879-af65e1f95869" containerName="oc" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.143404 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb0d7de-b864-4371-9879-af65e1f95869" containerName="oc" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.143663 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb0d7de-b864-4371-9879-af65e1f95869" containerName="oc" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.144544 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.147069 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.147494 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.147740 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.154085 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536296-9wzr4"] Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.261518 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfnnt\" (UniqueName: \"kubernetes.io/projected/d8bb4d0a-b66c-4493-b990-cd23305b481d-kube-api-access-vfnnt\") pod \"auto-csr-approver-29536296-9wzr4\" (UID: \"d8bb4d0a-b66c-4493-b990-cd23305b481d\") " pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.364200 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfnnt\" (UniqueName: \"kubernetes.io/projected/d8bb4d0a-b66c-4493-b990-cd23305b481d-kube-api-access-vfnnt\") pod \"auto-csr-approver-29536296-9wzr4\" (UID: \"d8bb4d0a-b66c-4493-b990-cd23305b481d\") " pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.384739 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfnnt\" (UniqueName: \"kubernetes.io/projected/d8bb4d0a-b66c-4493-b990-cd23305b481d-kube-api-access-vfnnt\") pod \"auto-csr-approver-29536296-9wzr4\" (UID: \"d8bb4d0a-b66c-4493-b990-cd23305b481d\") " 
pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.465603 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.917234 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536296-9wzr4"] Feb 27 07:36:00 crc kubenswrapper[4725]: I0227 07:36:00.925083 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:36:01 crc kubenswrapper[4725]: I0227 07:36:01.112707 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" event={"ID":"d8bb4d0a-b66c-4493-b990-cd23305b481d","Type":"ContainerStarted","Data":"b5955b9d055d3f626039f43687543c3fd372e0047cad4a7e9965267af3335c6d"} Feb 27 07:36:03 crc kubenswrapper[4725]: I0227 07:36:03.133249 4725 generic.go:334] "Generic (PLEG): container finished" podID="d8bb4d0a-b66c-4493-b990-cd23305b481d" containerID="435d07fddf7be4b6547ca3411980a39f1dc66b7188b15f072d2d5b671434ec5b" exitCode=0 Feb 27 07:36:03 crc kubenswrapper[4725]: I0227 07:36:03.133348 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" event={"ID":"d8bb4d0a-b66c-4493-b990-cd23305b481d","Type":"ContainerDied","Data":"435d07fddf7be4b6547ca3411980a39f1dc66b7188b15f072d2d5b671434ec5b"} Feb 27 07:36:03 crc kubenswrapper[4725]: I0227 07:36:03.251837 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:36:03 crc kubenswrapper[4725]: E0227 07:36:03.252216 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:36:04 crc kubenswrapper[4725]: I0227 07:36:04.505489 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:04 crc kubenswrapper[4725]: I0227 07:36:04.668575 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfnnt\" (UniqueName: \"kubernetes.io/projected/d8bb4d0a-b66c-4493-b990-cd23305b481d-kube-api-access-vfnnt\") pod \"d8bb4d0a-b66c-4493-b990-cd23305b481d\" (UID: \"d8bb4d0a-b66c-4493-b990-cd23305b481d\") " Feb 27 07:36:04 crc kubenswrapper[4725]: I0227 07:36:04.674809 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bb4d0a-b66c-4493-b990-cd23305b481d-kube-api-access-vfnnt" (OuterVolumeSpecName: "kube-api-access-vfnnt") pod "d8bb4d0a-b66c-4493-b990-cd23305b481d" (UID: "d8bb4d0a-b66c-4493-b990-cd23305b481d"). InnerVolumeSpecName "kube-api-access-vfnnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:36:04 crc kubenswrapper[4725]: I0227 07:36:04.772031 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfnnt\" (UniqueName: \"kubernetes.io/projected/d8bb4d0a-b66c-4493-b990-cd23305b481d-kube-api-access-vfnnt\") on node \"crc\" DevicePath \"\"" Feb 27 07:36:05 crc kubenswrapper[4725]: I0227 07:36:05.150346 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" event={"ID":"d8bb4d0a-b66c-4493-b990-cd23305b481d","Type":"ContainerDied","Data":"b5955b9d055d3f626039f43687543c3fd372e0047cad4a7e9965267af3335c6d"} Feb 27 07:36:05 crc kubenswrapper[4725]: I0227 07:36:05.150647 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5955b9d055d3f626039f43687543c3fd372e0047cad4a7e9965267af3335c6d" Feb 27 07:36:05 crc kubenswrapper[4725]: I0227 07:36:05.150401 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536296-9wzr4" Feb 27 07:36:05 crc kubenswrapper[4725]: I0227 07:36:05.573273 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536290-bd2z6"] Feb 27 07:36:05 crc kubenswrapper[4725]: I0227 07:36:05.581955 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536290-bd2z6"] Feb 27 07:36:06 crc kubenswrapper[4725]: I0227 07:36:06.273661 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3b27e1-7a3d-45de-8c43-acce6899aa85" path="/var/lib/kubelet/pods/2c3b27e1-7a3d-45de-8c43-acce6899aa85/volumes" Feb 27 07:36:08 crc kubenswrapper[4725]: I0227 07:36:08.470401 4725 scope.go:117] "RemoveContainer" containerID="287765fe15232e396b3a39f723943c4dad0b4091f71242e2b286c28a4298e12c" Feb 27 07:36:16 crc kubenswrapper[4725]: I0227 07:36:16.251264 4725 scope.go:117] "RemoveContainer" 
containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:36:16 crc kubenswrapper[4725]: E0227 07:36:16.252023 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:36:31 crc kubenswrapper[4725]: I0227 07:36:31.252932 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:36:31 crc kubenswrapper[4725]: E0227 07:36:31.254228 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:36:42 crc kubenswrapper[4725]: I0227 07:36:42.287846 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:36:42 crc kubenswrapper[4725]: E0227 07:36:42.288754 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:36:53 crc kubenswrapper[4725]: I0227 07:36:53.251225 4725 scope.go:117] 
"RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:36:53 crc kubenswrapper[4725]: E0227 07:36:53.251938 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:37:05 crc kubenswrapper[4725]: I0227 07:37:05.251757 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:37:05 crc kubenswrapper[4725]: E0227 07:37:05.253142 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:37:16 crc kubenswrapper[4725]: I0227 07:37:16.252630 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:37:16 crc kubenswrapper[4725]: E0227 07:37:16.253997 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.416402 
4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bvfm7"] Feb 27 07:37:17 crc kubenswrapper[4725]: E0227 07:37:17.417042 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bb4d0a-b66c-4493-b990-cd23305b481d" containerName="oc" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.417065 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bb4d0a-b66c-4493-b990-cd23305b481d" containerName="oc" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.417469 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bb4d0a-b66c-4493-b990-cd23305b481d" containerName="oc" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.423892 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.437456 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvfm7"] Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.589303 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-catalog-content\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.589351 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-utilities\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.589384 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6llfq\" (UniqueName: \"kubernetes.io/projected/b8180246-9f38-4fb8-9b80-fe9aa5169829-kube-api-access-6llfq\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.691357 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-catalog-content\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.691964 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-utilities\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.691888 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-catalog-content\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.692066 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6llfq\" (UniqueName: \"kubernetes.io/projected/b8180246-9f38-4fb8-9b80-fe9aa5169829-kube-api-access-6llfq\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.692392 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-utilities\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.720461 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6llfq\" (UniqueName: \"kubernetes.io/projected/b8180246-9f38-4fb8-9b80-fe9aa5169829-kube-api-access-6llfq\") pod \"redhat-marketplace-bvfm7\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:17 crc kubenswrapper[4725]: I0227 07:37:17.767417 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:18 crc kubenswrapper[4725]: I0227 07:37:18.266634 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvfm7"] Feb 27 07:37:18 crc kubenswrapper[4725]: I0227 07:37:18.882973 4725 generic.go:334] "Generic (PLEG): container finished" podID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerID="e0e0e7d89bf0427f35741574dffa6156ba9ac2576d7e226555870e904e516b2b" exitCode=0 Feb 27 07:37:18 crc kubenswrapper[4725]: I0227 07:37:18.883172 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerDied","Data":"e0e0e7d89bf0427f35741574dffa6156ba9ac2576d7e226555870e904e516b2b"} Feb 27 07:37:18 crc kubenswrapper[4725]: I0227 07:37:18.883357 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerStarted","Data":"45915034af25a7d68061446f4026d3e7ac771d3c091b9e40c91fe2b1899e25e8"} Feb 27 07:37:19 crc kubenswrapper[4725]: I0227 07:37:19.894631 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerStarted","Data":"221cbb6d8e308df80d2a125f9195097eb3ba84c9d6d61e4fd928160fb394989e"} Feb 27 07:37:20 crc kubenswrapper[4725]: I0227 07:37:20.905541 4725 generic.go:334] "Generic (PLEG): container finished" podID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerID="221cbb6d8e308df80d2a125f9195097eb3ba84c9d6d61e4fd928160fb394989e" exitCode=0 Feb 27 07:37:20 crc kubenswrapper[4725]: I0227 07:37:20.905597 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerDied","Data":"221cbb6d8e308df80d2a125f9195097eb3ba84c9d6d61e4fd928160fb394989e"} Feb 27 07:37:21 crc kubenswrapper[4725]: I0227 07:37:21.916499 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerStarted","Data":"6250ba6ec01b8a44e14dc51d3a0e1eead618d6d39aba49fe59bee033be5a2b37"} Feb 27 07:37:21 crc kubenswrapper[4725]: I0227 07:37:21.939729 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bvfm7" podStartSLOduration=2.297733944 podStartE2EDuration="4.939709646s" podCreationTimestamp="2026-02-27 07:37:17 +0000 UTC" firstStartedPulling="2026-02-27 07:37:18.884874348 +0000 UTC m=+5217.347494917" lastFinishedPulling="2026-02-27 07:37:21.52685005 +0000 UTC m=+5219.989470619" observedRunningTime="2026-02-27 07:37:21.934051666 +0000 UTC m=+5220.396672245" watchObservedRunningTime="2026-02-27 07:37:21.939709646 +0000 UTC m=+5220.402330225" Feb 27 07:37:27 crc kubenswrapper[4725]: I0227 07:37:27.768352 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:27 crc kubenswrapper[4725]: I0227 07:37:27.768925 4725 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:27 crc kubenswrapper[4725]: I0227 07:37:27.813051 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:28 crc kubenswrapper[4725]: I0227 07:37:28.039000 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:28 crc kubenswrapper[4725]: I0227 07:37:28.110309 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvfm7"] Feb 27 07:37:30 crc kubenswrapper[4725]: I0227 07:37:30.000573 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bvfm7" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="registry-server" containerID="cri-o://6250ba6ec01b8a44e14dc51d3a0e1eead618d6d39aba49fe59bee033be5a2b37" gracePeriod=2 Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.010950 4725 generic.go:334] "Generic (PLEG): container finished" podID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerID="6250ba6ec01b8a44e14dc51d3a0e1eead618d6d39aba49fe59bee033be5a2b37" exitCode=0 Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.011518 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerDied","Data":"6250ba6ec01b8a44e14dc51d3a0e1eead618d6d39aba49fe59bee033be5a2b37"} Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.011548 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvfm7" event={"ID":"b8180246-9f38-4fb8-9b80-fe9aa5169829","Type":"ContainerDied","Data":"45915034af25a7d68061446f4026d3e7ac771d3c091b9e40c91fe2b1899e25e8"} Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 
07:37:31.011562 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45915034af25a7d68061446f4026d3e7ac771d3c091b9e40c91fe2b1899e25e8" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.083019 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.192905 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-utilities\") pod \"b8180246-9f38-4fb8-9b80-fe9aa5169829\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.193361 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6llfq\" (UniqueName: \"kubernetes.io/projected/b8180246-9f38-4fb8-9b80-fe9aa5169829-kube-api-access-6llfq\") pod \"b8180246-9f38-4fb8-9b80-fe9aa5169829\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.193489 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-catalog-content\") pod \"b8180246-9f38-4fb8-9b80-fe9aa5169829\" (UID: \"b8180246-9f38-4fb8-9b80-fe9aa5169829\") " Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.193998 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-utilities" (OuterVolumeSpecName: "utilities") pod "b8180246-9f38-4fb8-9b80-fe9aa5169829" (UID: "b8180246-9f38-4fb8-9b80-fe9aa5169829"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.194929 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.200456 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8180246-9f38-4fb8-9b80-fe9aa5169829-kube-api-access-6llfq" (OuterVolumeSpecName: "kube-api-access-6llfq") pod "b8180246-9f38-4fb8-9b80-fe9aa5169829" (UID: "b8180246-9f38-4fb8-9b80-fe9aa5169829"). InnerVolumeSpecName "kube-api-access-6llfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.229694 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8180246-9f38-4fb8-9b80-fe9aa5169829" (UID: "b8180246-9f38-4fb8-9b80-fe9aa5169829"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.252404 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:37:31 crc kubenswrapper[4725]: E0227 07:37:31.252828 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.297360 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6llfq\" (UniqueName: \"kubernetes.io/projected/b8180246-9f38-4fb8-9b80-fe9aa5169829-kube-api-access-6llfq\") on node \"crc\" DevicePath \"\"" Feb 27 07:37:31 crc kubenswrapper[4725]: I0227 07:37:31.297401 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8180246-9f38-4fb8-9b80-fe9aa5169829-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:37:32 crc kubenswrapper[4725]: I0227 07:37:32.018207 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvfm7" Feb 27 07:37:32 crc kubenswrapper[4725]: I0227 07:37:32.066820 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvfm7"] Feb 27 07:37:32 crc kubenswrapper[4725]: I0227 07:37:32.078897 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvfm7"] Feb 27 07:37:32 crc kubenswrapper[4725]: I0227 07:37:32.273517 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" path="/var/lib/kubelet/pods/b8180246-9f38-4fb8-9b80-fe9aa5169829/volumes" Feb 27 07:37:44 crc kubenswrapper[4725]: I0227 07:37:44.251842 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:37:44 crc kubenswrapper[4725]: E0227 07:37:44.252602 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:37:58 crc kubenswrapper[4725]: I0227 07:37:58.251308 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:37:58 crc kubenswrapper[4725]: E0227 07:37:58.252240 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.163783 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536298-tlcwr"] Feb 27 07:38:00 crc kubenswrapper[4725]: E0227 07:38:00.164879 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="registry-server" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.164902 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="registry-server" Feb 27 07:38:00 crc kubenswrapper[4725]: E0227 07:38:00.164955 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="extract-utilities" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.164970 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="extract-utilities" Feb 27 07:38:00 crc kubenswrapper[4725]: E0227 07:38:00.165039 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="extract-content" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.165065 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="extract-content" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.165533 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8180246-9f38-4fb8-9b80-fe9aa5169829" containerName="registry-server" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.166726 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.170479 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.170580 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.170649 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.175796 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbfv\" (UniqueName: \"kubernetes.io/projected/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c-kube-api-access-cvbfv\") pod \"auto-csr-approver-29536298-tlcwr\" (UID: \"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c\") " pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.182653 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536298-tlcwr"] Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.280583 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbfv\" (UniqueName: \"kubernetes.io/projected/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c-kube-api-access-cvbfv\") pod \"auto-csr-approver-29536298-tlcwr\" (UID: \"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c\") " pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.306342 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbfv\" (UniqueName: \"kubernetes.io/projected/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c-kube-api-access-cvbfv\") pod \"auto-csr-approver-29536298-tlcwr\" (UID: \"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c\") " 
pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:00 crc kubenswrapper[4725]: I0227 07:38:00.491396 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:01 crc kubenswrapper[4725]: W0227 07:38:01.052554 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdab3cf7_f7ce_4ac1_9935_5bd85b61738c.slice/crio-8f98d52c9c37ae5a0105f4cd9b1dd478b3bc6f3da591f9260094a1f68a9ab0db WatchSource:0}: Error finding container 8f98d52c9c37ae5a0105f4cd9b1dd478b3bc6f3da591f9260094a1f68a9ab0db: Status 404 returned error can't find the container with id 8f98d52c9c37ae5a0105f4cd9b1dd478b3bc6f3da591f9260094a1f68a9ab0db Feb 27 07:38:01 crc kubenswrapper[4725]: I0227 07:38:01.057404 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536298-tlcwr"] Feb 27 07:38:01 crc kubenswrapper[4725]: I0227 07:38:01.361156 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" event={"ID":"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c","Type":"ContainerStarted","Data":"8f98d52c9c37ae5a0105f4cd9b1dd478b3bc6f3da591f9260094a1f68a9ab0db"} Feb 27 07:38:03 crc kubenswrapper[4725]: I0227 07:38:03.386100 4725 generic.go:334] "Generic (PLEG): container finished" podID="cdab3cf7-f7ce-4ac1-9935-5bd85b61738c" containerID="4595fe44e9ac629797d455a904641679c2e381ecada87c2a9b2aac507d132725" exitCode=0 Feb 27 07:38:03 crc kubenswrapper[4725]: I0227 07:38:03.386210 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" event={"ID":"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c","Type":"ContainerDied","Data":"4595fe44e9ac629797d455a904641679c2e381ecada87c2a9b2aac507d132725"} Feb 27 07:38:04 crc kubenswrapper[4725]: I0227 07:38:04.794107 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:04 crc kubenswrapper[4725]: I0227 07:38:04.886855 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbfv\" (UniqueName: \"kubernetes.io/projected/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c-kube-api-access-cvbfv\") pod \"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c\" (UID: \"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c\") " Feb 27 07:38:04 crc kubenswrapper[4725]: I0227 07:38:04.900368 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c-kube-api-access-cvbfv" (OuterVolumeSpecName: "kube-api-access-cvbfv") pod "cdab3cf7-f7ce-4ac1-9935-5bd85b61738c" (UID: "cdab3cf7-f7ce-4ac1-9935-5bd85b61738c"). InnerVolumeSpecName "kube-api-access-cvbfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:38:04 crc kubenswrapper[4725]: I0227 07:38:04.989029 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbfv\" (UniqueName: \"kubernetes.io/projected/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c-kube-api-access-cvbfv\") on node \"crc\" DevicePath \"\"" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.409581 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" event={"ID":"cdab3cf7-f7ce-4ac1-9935-5bd85b61738c","Type":"ContainerDied","Data":"8f98d52c9c37ae5a0105f4cd9b1dd478b3bc6f3da591f9260094a1f68a9ab0db"} Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.409614 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f98d52c9c37ae5a0105f4cd9b1dd478b3bc6f3da591f9260094a1f68a9ab0db" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.409626 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536298-tlcwr" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.624746 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6k2sx"] Feb 27 07:38:05 crc kubenswrapper[4725]: E0227 07:38:05.625590 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdab3cf7-f7ce-4ac1-9935-5bd85b61738c" containerName="oc" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.625613 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdab3cf7-f7ce-4ac1-9935-5bd85b61738c" containerName="oc" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.625880 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdab3cf7-f7ce-4ac1-9935-5bd85b61738c" containerName="oc" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.638118 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.659204 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6k2sx"] Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.703349 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-utilities\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.703500 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5x5x\" (UniqueName: \"kubernetes.io/projected/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-kube-api-access-x5x5x\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 
07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.703549 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-catalog-content\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.805118 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5x5x\" (UniqueName: \"kubernetes.io/projected/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-kube-api-access-x5x5x\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.805887 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-catalog-content\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.806127 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-utilities\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.806609 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-catalog-content\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: 
I0227 07:38:05.806655 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-utilities\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.826640 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5x5x\" (UniqueName: \"kubernetes.io/projected/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-kube-api-access-x5x5x\") pod \"redhat-operators-6k2sx\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.873927 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536292-bxchs"] Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.892656 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536292-bxchs"] Feb 27 07:38:05 crc kubenswrapper[4725]: I0227 07:38:05.986929 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:06 crc kubenswrapper[4725]: I0227 07:38:06.263493 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd62ee4-4541-4661-85d8-98eb01b9f5a9" path="/var/lib/kubelet/pods/dbd62ee4-4541-4661-85d8-98eb01b9f5a9/volumes" Feb 27 07:38:06 crc kubenswrapper[4725]: I0227 07:38:06.491048 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6k2sx"] Feb 27 07:38:07 crc kubenswrapper[4725]: I0227 07:38:07.428940 4725 generic.go:334] "Generic (PLEG): container finished" podID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerID="c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c" exitCode=0 Feb 27 07:38:07 crc kubenswrapper[4725]: I0227 07:38:07.429041 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerDied","Data":"c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c"} Feb 27 07:38:07 crc kubenswrapper[4725]: I0227 07:38:07.429227 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerStarted","Data":"aaeb7938ef7e60761fa4d502c3517a23fff640a65113eaf2f63d0a90830aecfa"} Feb 27 07:38:08 crc kubenswrapper[4725]: I0227 07:38:08.441551 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerStarted","Data":"98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e"} Feb 27 07:38:08 crc kubenswrapper[4725]: I0227 07:38:08.589119 4725 scope.go:117] "RemoveContainer" containerID="0e9822a331a88a9892d41ec4602433ba2a579c82d4810fc5a291f96183c3d8a4" Feb 27 07:38:11 crc kubenswrapper[4725]: I0227 07:38:11.251593 4725 scope.go:117] "RemoveContainer" 
containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:38:11 crc kubenswrapper[4725]: E0227 07:38:11.252628 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:38:13 crc kubenswrapper[4725]: I0227 07:38:13.493366 4725 generic.go:334] "Generic (PLEG): container finished" podID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerID="98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e" exitCode=0 Feb 27 07:38:13 crc kubenswrapper[4725]: I0227 07:38:13.493577 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerDied","Data":"98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e"} Feb 27 07:38:14 crc kubenswrapper[4725]: I0227 07:38:14.504864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerStarted","Data":"63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa"} Feb 27 07:38:14 crc kubenswrapper[4725]: I0227 07:38:14.527827 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6k2sx" podStartSLOduration=3.066236534 podStartE2EDuration="9.52780865s" podCreationTimestamp="2026-02-27 07:38:05 +0000 UTC" firstStartedPulling="2026-02-27 07:38:07.432047664 +0000 UTC m=+5265.894668233" lastFinishedPulling="2026-02-27 07:38:13.89361978 +0000 UTC m=+5272.356240349" observedRunningTime="2026-02-27 07:38:14.521680546 +0000 UTC 
m=+5272.984301125" watchObservedRunningTime="2026-02-27 07:38:14.52780865 +0000 UTC m=+5272.990429219" Feb 27 07:38:15 crc kubenswrapper[4725]: I0227 07:38:15.987934 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:15 crc kubenswrapper[4725]: I0227 07:38:15.987988 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:17 crc kubenswrapper[4725]: I0227 07:38:17.035677 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6k2sx" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="registry-server" probeResult="failure" output=< Feb 27 07:38:17 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:38:17 crc kubenswrapper[4725]: > Feb 27 07:38:23 crc kubenswrapper[4725]: I0227 07:38:23.251992 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:38:23 crc kubenswrapper[4725]: E0227 07:38:23.252773 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:38:26 crc kubenswrapper[4725]: I0227 07:38:26.035569 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:26 crc kubenswrapper[4725]: I0227 07:38:26.094101 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:26 crc kubenswrapper[4725]: I0227 
07:38:26.280121 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6k2sx"] Feb 27 07:38:27 crc kubenswrapper[4725]: I0227 07:38:27.633824 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6k2sx" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="registry-server" containerID="cri-o://63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa" gracePeriod=2 Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.183856 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.289734 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-catalog-content\") pod \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.290013 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-utilities\") pod \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.290098 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5x5x\" (UniqueName: \"kubernetes.io/projected/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-kube-api-access-x5x5x\") pod \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\" (UID: \"252c250f-e52c-4fbb-ba05-d9e8a3e4d325\") " Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.290896 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-utilities" (OuterVolumeSpecName: 
"utilities") pod "252c250f-e52c-4fbb-ba05-d9e8a3e4d325" (UID: "252c250f-e52c-4fbb-ba05-d9e8a3e4d325"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.296451 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-kube-api-access-x5x5x" (OuterVolumeSpecName: "kube-api-access-x5x5x") pod "252c250f-e52c-4fbb-ba05-d9e8a3e4d325" (UID: "252c250f-e52c-4fbb-ba05-d9e8a3e4d325"). InnerVolumeSpecName "kube-api-access-x5x5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.392663 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.392706 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5x5x\" (UniqueName: \"kubernetes.io/projected/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-kube-api-access-x5x5x\") on node \"crc\" DevicePath \"\"" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.428119 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "252c250f-e52c-4fbb-ba05-d9e8a3e4d325" (UID: "252c250f-e52c-4fbb-ba05-d9e8a3e4d325"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.494413 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/252c250f-e52c-4fbb-ba05-d9e8a3e4d325-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.650533 4725 generic.go:334] "Generic (PLEG): container finished" podID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerID="63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa" exitCode=0 Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.650578 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerDied","Data":"63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa"} Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.650604 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6k2sx" event={"ID":"252c250f-e52c-4fbb-ba05-d9e8a3e4d325","Type":"ContainerDied","Data":"aaeb7938ef7e60761fa4d502c3517a23fff640a65113eaf2f63d0a90830aecfa"} Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.650619 4725 scope.go:117] "RemoveContainer" containerID="63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.650658 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6k2sx" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.692615 4725 scope.go:117] "RemoveContainer" containerID="98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.717147 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6k2sx"] Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.738732 4725 scope.go:117] "RemoveContainer" containerID="c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.741313 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6k2sx"] Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.780742 4725 scope.go:117] "RemoveContainer" containerID="63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa" Feb 27 07:38:28 crc kubenswrapper[4725]: E0227 07:38:28.781314 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa\": container with ID starting with 63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa not found: ID does not exist" containerID="63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.781367 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa"} err="failed to get container status \"63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa\": rpc error: code = NotFound desc = could not find container \"63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa\": container with ID starting with 63a9b2813b8d083db745340282d534e4129b611e4c5704587c2365554102abaa not found: ID does 
not exist" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.781394 4725 scope.go:117] "RemoveContainer" containerID="98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e" Feb 27 07:38:28 crc kubenswrapper[4725]: E0227 07:38:28.781880 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e\": container with ID starting with 98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e not found: ID does not exist" containerID="98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.781951 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e"} err="failed to get container status \"98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e\": rpc error: code = NotFound desc = could not find container \"98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e\": container with ID starting with 98ef4f07975b92ce77b54b3a1c8810ad6f5f29df6c754ffaf821d65f1359002e not found: ID does not exist" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.781998 4725 scope.go:117] "RemoveContainer" containerID="c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c" Feb 27 07:38:28 crc kubenswrapper[4725]: E0227 07:38:28.782605 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c\": container with ID starting with c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c not found: ID does not exist" containerID="c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c" Feb 27 07:38:28 crc kubenswrapper[4725]: I0227 07:38:28.782640 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c"} err="failed to get container status \"c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c\": rpc error: code = NotFound desc = could not find container \"c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c\": container with ID starting with c215e2f7ff23199bd50807fa8f8587665f8162ffc446c7d3784af0fe30f2c41c not found: ID does not exist" Feb 27 07:38:30 crc kubenswrapper[4725]: I0227 07:38:30.265446 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" path="/var/lib/kubelet/pods/252c250f-e52c-4fbb-ba05-d9e8a3e4d325/volumes" Feb 27 07:38:36 crc kubenswrapper[4725]: I0227 07:38:36.251929 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:38:36 crc kubenswrapper[4725]: E0227 07:38:36.252662 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:38:47 crc kubenswrapper[4725]: I0227 07:38:47.251499 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:38:47 crc kubenswrapper[4725]: E0227 07:38:47.252213 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:39:00 crc kubenswrapper[4725]: I0227 07:39:00.252494 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:39:00 crc kubenswrapper[4725]: E0227 07:39:00.253711 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:39:12 crc kubenswrapper[4725]: I0227 07:39:12.151980 4725 generic.go:334] "Generic (PLEG): container finished" podID="a4ededce-4af9-418c-af09-c79e79cb044f" containerID="60870ee4b4edbe49c0eccdf1ff1496c815205624b5ab0c6d0e3e4b77cf442e6a" exitCode=0 Feb 27 07:39:12 crc kubenswrapper[4725]: I0227 07:39:12.152100 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4ededce-4af9-418c-af09-c79e79cb044f","Type":"ContainerDied","Data":"60870ee4b4edbe49c0eccdf1ff1496c815205624b5ab0c6d0e3e4b77cf442e6a"} Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.251920 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:39:13 crc kubenswrapper[4725]: E0227 07:39:13.252470 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" 
podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.574710 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661062 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-temporary\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661139 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661169 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-config-data\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661247 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-workdir\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661367 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6gh7\" (UniqueName: \"kubernetes.io/projected/a4ededce-4af9-418c-af09-c79e79cb044f-kube-api-access-h6gh7\") pod 
\"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661414 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ca-certs\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661527 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config-secret\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.661557 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.662092 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ssh-key\") pod \"a4ededce-4af9-418c-af09-c79e79cb044f\" (UID: \"a4ededce-4af9-418c-af09-c79e79cb044f\") " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.668675 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.669321 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.670001 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-config-data" (OuterVolumeSpecName: "config-data") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.672764 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ededce-4af9-418c-af09-c79e79cb044f-kube-api-access-h6gh7" (OuterVolumeSpecName: "kube-api-access-h6gh7") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "kube-api-access-h6gh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.672922 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.706961 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.714326 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.723125 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.738568 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a4ededce-4af9-418c-af09-c79e79cb044f" (UID: "a4ededce-4af9-418c-af09-c79e79cb044f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765128 4725 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765180 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6gh7\" (UniqueName: \"kubernetes.io/projected/a4ededce-4af9-418c-af09-c79e79cb044f-kube-api-access-h6gh7\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765193 4725 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765206 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765245 4725 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765255 4725 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4ededce-4af9-418c-af09-c79e79cb044f-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765265 4725 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a4ededce-4af9-418c-af09-c79e79cb044f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 27 
07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765274 4725 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.765301 4725 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4ededce-4af9-418c-af09-c79e79cb044f-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.789495 4725 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 27 07:39:13 crc kubenswrapper[4725]: I0227 07:39:13.867030 4725 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 27 07:39:14 crc kubenswrapper[4725]: I0227 07:39:14.176197 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a4ededce-4af9-418c-af09-c79e79cb044f","Type":"ContainerDied","Data":"3438e28c42b333e3d9f54615c9e4c5ec3f1e1b10f7e5b0d54f340d9b2e7f8476"} Feb 27 07:39:14 crc kubenswrapper[4725]: I0227 07:39:14.176707 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3438e28c42b333e3d9f54615c9e4c5ec3f1e1b10f7e5b0d54f340d9b2e7f8476" Feb 27 07:39:14 crc kubenswrapper[4725]: I0227 07:39:14.176330 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.323491 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 07:39:21 crc kubenswrapper[4725]: E0227 07:39:21.325324 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ededce-4af9-418c-af09-c79e79cb044f" containerName="tempest-tests-tempest-tests-runner" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.325367 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ededce-4af9-418c-af09-c79e79cb044f" containerName="tempest-tests-tempest-tests-runner" Feb 27 07:39:21 crc kubenswrapper[4725]: E0227 07:39:21.325423 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="extract-content" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.325443 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="extract-content" Feb 27 07:39:21 crc kubenswrapper[4725]: E0227 07:39:21.325477 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="extract-utilities" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.325497 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="extract-utilities" Feb 27 07:39:21 crc kubenswrapper[4725]: E0227 07:39:21.325553 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="registry-server" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.325571 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="registry-server" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.326083 4725 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="252c250f-e52c-4fbb-ba05-d9e8a3e4d325" containerName="registry-server" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.326121 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ededce-4af9-418c-af09-c79e79cb044f" containerName="tempest-tests-tempest-tests-runner" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.327771 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.332048 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p8rhj" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.335129 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.461444 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.461519 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7wx8\" (UniqueName: \"kubernetes.io/projected/1ef86f5d-c8f3-4077-8184-4aecfa313695-kube-api-access-h7wx8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.564116 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.564223 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7wx8\" (UniqueName: \"kubernetes.io/projected/1ef86f5d-c8f3-4077-8184-4aecfa313695-kube-api-access-h7wx8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:21 crc kubenswrapper[4725]: I0227 07:39:21.564719 4725 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:22 crc kubenswrapper[4725]: I0227 07:39:22.295254 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7wx8\" (UniqueName: \"kubernetes.io/projected/1ef86f5d-c8f3-4077-8184-4aecfa313695-kube-api-access-h7wx8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:22 crc kubenswrapper[4725]: I0227 07:39:22.351586 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ef86f5d-c8f3-4077-8184-4aecfa313695\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:22 
crc kubenswrapper[4725]: I0227 07:39:22.558077 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p8rhj" Feb 27 07:39:22 crc kubenswrapper[4725]: I0227 07:39:22.565049 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 07:39:23 crc kubenswrapper[4725]: I0227 07:39:23.011877 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 07:39:23 crc kubenswrapper[4725]: I0227 07:39:23.287815 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1ef86f5d-c8f3-4077-8184-4aecfa313695","Type":"ContainerStarted","Data":"2cf91de0833d6434e6976ce752eb80c180c8bf0762fbc03be39897aeb26245de"} Feb 27 07:39:25 crc kubenswrapper[4725]: I0227 07:39:25.308801 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1ef86f5d-c8f3-4077-8184-4aecfa313695","Type":"ContainerStarted","Data":"f16a375f903c9acb888441a50640ec2d6714f1aadb3240a760fa62d20435ee37"} Feb 27 07:39:25 crc kubenswrapper[4725]: I0227 07:39:25.330654 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.689558755 podStartE2EDuration="4.330637244s" podCreationTimestamp="2026-02-27 07:39:21 +0000 UTC" firstStartedPulling="2026-02-27 07:39:23.021255211 +0000 UTC m=+5341.483875780" lastFinishedPulling="2026-02-27 07:39:24.66233369 +0000 UTC m=+5343.124954269" observedRunningTime="2026-02-27 07:39:25.320313972 +0000 UTC m=+5343.782934551" watchObservedRunningTime="2026-02-27 07:39:25.330637244 +0000 UTC m=+5343.793257813" Feb 27 07:39:28 crc kubenswrapper[4725]: I0227 07:39:28.252313 4725 scope.go:117] "RemoveContainer" 
containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:39:28 crc kubenswrapper[4725]: E0227 07:39:28.253232 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:39:40 crc kubenswrapper[4725]: I0227 07:39:40.253460 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:39:41 crc kubenswrapper[4725]: I0227 07:39:41.487103 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"3deb9a7f7c3458bea4b2fc28fe6aed78eaa0a969dd24bc29a40c059dd7e94a96"} Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.878390 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hkvvl/must-gather-s9ngw"] Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.880956 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.883335 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hkvvl"/"kube-root-ca.crt" Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.883719 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hkvvl"/"default-dockercfg-jg7vc" Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.883985 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hkvvl"/"openshift-service-ca.crt" Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.893049 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hkvvl/must-gather-s9ngw"] Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.972445 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmsh\" (UniqueName: \"kubernetes.io/projected/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-kube-api-access-ppmsh\") pod \"must-gather-s9ngw\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:47 crc kubenswrapper[4725]: I0227 07:39:47.972667 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-must-gather-output\") pod \"must-gather-s9ngw\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:48 crc kubenswrapper[4725]: I0227 07:39:48.076088 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-must-gather-output\") pod \"must-gather-s9ngw\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " 
pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:48 crc kubenswrapper[4725]: I0227 07:39:48.076313 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmsh\" (UniqueName: \"kubernetes.io/projected/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-kube-api-access-ppmsh\") pod \"must-gather-s9ngw\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:48 crc kubenswrapper[4725]: I0227 07:39:48.076524 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-must-gather-output\") pod \"must-gather-s9ngw\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:48 crc kubenswrapper[4725]: I0227 07:39:48.277631 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmsh\" (UniqueName: \"kubernetes.io/projected/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-kube-api-access-ppmsh\") pod \"must-gather-s9ngw\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:48 crc kubenswrapper[4725]: I0227 07:39:48.499049 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:39:49 crc kubenswrapper[4725]: I0227 07:39:49.038866 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hkvvl/must-gather-s9ngw"] Feb 27 07:39:49 crc kubenswrapper[4725]: I0227 07:39:49.577649 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" event={"ID":"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f","Type":"ContainerStarted","Data":"0b66957b862f549e9842d69d3129484cf2c723235d289103b4bc5f84a737fb59"} Feb 27 07:39:55 crc kubenswrapper[4725]: I0227 07:39:55.641509 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" event={"ID":"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f","Type":"ContainerStarted","Data":"44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80"} Feb 27 07:39:55 crc kubenswrapper[4725]: I0227 07:39:55.642104 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" event={"ID":"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f","Type":"ContainerStarted","Data":"5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569"} Feb 27 07:39:55 crc kubenswrapper[4725]: I0227 07:39:55.663401 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" podStartSLOduration=2.592402698 podStartE2EDuration="8.663383508s" podCreationTimestamp="2026-02-27 07:39:47 +0000 UTC" firstStartedPulling="2026-02-27 07:39:49.055870137 +0000 UTC m=+5367.518490726" lastFinishedPulling="2026-02-27 07:39:55.126850977 +0000 UTC m=+5373.589471536" observedRunningTime="2026-02-27 07:39:55.653774016 +0000 UTC m=+5374.116394585" watchObservedRunningTime="2026-02-27 07:39:55.663383508 +0000 UTC m=+5374.126004077" Feb 27 07:39:59 crc kubenswrapper[4725]: I0227 07:39:59.803140 4725 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-hkvvl/crc-debug-vmqrc"] Feb 27 07:39:59 crc kubenswrapper[4725]: I0227 07:39:59.805073 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:39:59 crc kubenswrapper[4725]: I0227 07:39:59.973711 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqhj\" (UniqueName: \"kubernetes.io/projected/354c14f9-8a02-4676-b450-aebe2bfe0d52-kube-api-access-zvqhj\") pod \"crc-debug-vmqrc\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:39:59 crc kubenswrapper[4725]: I0227 07:39:59.973927 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/354c14f9-8a02-4676-b450-aebe2bfe0d52-host\") pod \"crc-debug-vmqrc\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.075449 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/354c14f9-8a02-4676-b450-aebe2bfe0d52-host\") pod \"crc-debug-vmqrc\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.075570 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/354c14f9-8a02-4676-b450-aebe2bfe0d52-host\") pod \"crc-debug-vmqrc\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.075657 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqhj\" (UniqueName: 
\"kubernetes.io/projected/354c14f9-8a02-4676-b450-aebe2bfe0d52-kube-api-access-zvqhj\") pod \"crc-debug-vmqrc\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.108126 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqhj\" (UniqueName: \"kubernetes.io/projected/354c14f9-8a02-4676-b450-aebe2bfe0d52-kube-api-access-zvqhj\") pod \"crc-debug-vmqrc\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.121494 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.166578 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536300-9w7wc"] Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.168388 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.171845 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.172355 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.172588 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.184414 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536300-9w7wc"] Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.289679 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97jbq\" (UniqueName: \"kubernetes.io/projected/16888d09-2b96-4094-af03-7b76e668a81f-kube-api-access-97jbq\") pod \"auto-csr-approver-29536300-9w7wc\" (UID: \"16888d09-2b96-4094-af03-7b76e668a81f\") " pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.391875 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97jbq\" (UniqueName: \"kubernetes.io/projected/16888d09-2b96-4094-af03-7b76e668a81f-kube-api-access-97jbq\") pod \"auto-csr-approver-29536300-9w7wc\" (UID: \"16888d09-2b96-4094-af03-7b76e668a81f\") " pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.413886 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97jbq\" (UniqueName: \"kubernetes.io/projected/16888d09-2b96-4094-af03-7b76e668a81f-kube-api-access-97jbq\") pod \"auto-csr-approver-29536300-9w7wc\" (UID: \"16888d09-2b96-4094-af03-7b76e668a81f\") " 
pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.587844 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:00 crc kubenswrapper[4725]: I0227 07:40:00.741396 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" event={"ID":"354c14f9-8a02-4676-b450-aebe2bfe0d52","Type":"ContainerStarted","Data":"314a6367aa87a7df4c66575f18358ac92385c51f7ea3d2a88f0d6e080f3e3d1a"} Feb 27 07:40:01 crc kubenswrapper[4725]: I0227 07:40:01.081619 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536300-9w7wc"] Feb 27 07:40:01 crc kubenswrapper[4725]: I0227 07:40:01.751636 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" event={"ID":"16888d09-2b96-4094-af03-7b76e668a81f","Type":"ContainerStarted","Data":"5422970e6dfdfdd9d88cfb68e5e3ffed1a0926e15686bfc936f32ddcec14ac4c"} Feb 27 07:40:02 crc kubenswrapper[4725]: I0227 07:40:02.761034 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" event={"ID":"16888d09-2b96-4094-af03-7b76e668a81f","Type":"ContainerStarted","Data":"5de1d5f8797056411d4b4d7be26b3efa056409ffb1a8b2e857f058bd1fad05f0"} Feb 27 07:40:02 crc kubenswrapper[4725]: I0227 07:40:02.781339 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" podStartSLOduration=1.460377925 podStartE2EDuration="2.78131461s" podCreationTimestamp="2026-02-27 07:40:00 +0000 UTC" firstStartedPulling="2026-02-27 07:40:01.105979891 +0000 UTC m=+5379.568600460" lastFinishedPulling="2026-02-27 07:40:02.426916576 +0000 UTC m=+5380.889537145" observedRunningTime="2026-02-27 07:40:02.775389782 +0000 UTC m=+5381.238010351" watchObservedRunningTime="2026-02-27 07:40:02.78131461 
+0000 UTC m=+5381.243935179" Feb 27 07:40:03 crc kubenswrapper[4725]: I0227 07:40:03.780883 4725 generic.go:334] "Generic (PLEG): container finished" podID="16888d09-2b96-4094-af03-7b76e668a81f" containerID="5de1d5f8797056411d4b4d7be26b3efa056409ffb1a8b2e857f058bd1fad05f0" exitCode=0 Feb 27 07:40:03 crc kubenswrapper[4725]: I0227 07:40:03.781078 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" event={"ID":"16888d09-2b96-4094-af03-7b76e668a81f","Type":"ContainerDied","Data":"5de1d5f8797056411d4b4d7be26b3efa056409ffb1a8b2e857f058bd1fad05f0"} Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.199714 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.335737 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97jbq\" (UniqueName: \"kubernetes.io/projected/16888d09-2b96-4094-af03-7b76e668a81f-kube-api-access-97jbq\") pod \"16888d09-2b96-4094-af03-7b76e668a81f\" (UID: \"16888d09-2b96-4094-af03-7b76e668a81f\") " Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.344577 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16888d09-2b96-4094-af03-7b76e668a81f-kube-api-access-97jbq" (OuterVolumeSpecName: "kube-api-access-97jbq") pod "16888d09-2b96-4094-af03-7b76e668a81f" (UID: "16888d09-2b96-4094-af03-7b76e668a81f"). InnerVolumeSpecName "kube-api-access-97jbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.350924 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536294-wbzfs"] Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.365353 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536294-wbzfs"] Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.438858 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97jbq\" (UniqueName: \"kubernetes.io/projected/16888d09-2b96-4094-af03-7b76e668a81f-kube-api-access-97jbq\") on node \"crc\" DevicePath \"\"" Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.804890 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" event={"ID":"16888d09-2b96-4094-af03-7b76e668a81f","Type":"ContainerDied","Data":"5422970e6dfdfdd9d88cfb68e5e3ffed1a0926e15686bfc936f32ddcec14ac4c"} Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.804947 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5422970e6dfdfdd9d88cfb68e5e3ffed1a0926e15686bfc936f32ddcec14ac4c" Feb 27 07:40:05 crc kubenswrapper[4725]: I0227 07:40:05.805018 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536300-9w7wc" Feb 27 07:40:06 crc kubenswrapper[4725]: I0227 07:40:06.265929 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb0d7de-b864-4371-9879-af65e1f95869" path="/var/lib/kubelet/pods/9eb0d7de-b864-4371-9879-af65e1f95869/volumes" Feb 27 07:40:08 crc kubenswrapper[4725]: I0227 07:40:08.734677 4725 scope.go:117] "RemoveContainer" containerID="a9e5139c2602c83ac94d6247b279eea1d00483f1e388c243fe98fa457f42f779" Feb 27 07:40:12 crc kubenswrapper[4725]: I0227 07:40:12.881000 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" event={"ID":"354c14f9-8a02-4676-b450-aebe2bfe0d52","Type":"ContainerStarted","Data":"cfd2460c2919f48499125689e29c35a5ef4eddeb928cd7f7a299da7ef5d29c2f"} Feb 27 07:40:12 crc kubenswrapper[4725]: I0227 07:40:12.907367 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" podStartSLOduration=1.5015310309999998 podStartE2EDuration="13.907342827s" podCreationTimestamp="2026-02-27 07:39:59 +0000 UTC" firstStartedPulling="2026-02-27 07:40:00.200148956 +0000 UTC m=+5378.662769525" lastFinishedPulling="2026-02-27 07:40:12.605960752 +0000 UTC m=+5391.068581321" observedRunningTime="2026-02-27 07:40:12.901500402 +0000 UTC m=+5391.364120981" watchObservedRunningTime="2026-02-27 07:40:12.907342827 +0000 UTC m=+5391.369963396" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.810837 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sz6wq"] Feb 27 07:40:47 crc kubenswrapper[4725]: E0227 07:40:47.812022 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16888d09-2b96-4094-af03-7b76e668a81f" containerName="oc" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.812039 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="16888d09-2b96-4094-af03-7b76e668a81f" 
containerName="oc" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.812376 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="16888d09-2b96-4094-af03-7b76e668a81f" containerName="oc" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.815990 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.824888 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sz6wq"] Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.982770 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-catalog-content\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.982941 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-utilities\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:47 crc kubenswrapper[4725]: I0227 07:40:47.983005 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2phw\" (UniqueName: \"kubernetes.io/projected/ae077687-09f7-489f-8bbc-9a6d5b1babc3-kube-api-access-b2phw\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.085238 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2phw\" 
(UniqueName: \"kubernetes.io/projected/ae077687-09f7-489f-8bbc-9a6d5b1babc3-kube-api-access-b2phw\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.085415 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-catalog-content\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.085554 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-utilities\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.085955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-catalog-content\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.086002 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-utilities\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.106318 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2phw\" (UniqueName: 
\"kubernetes.io/projected/ae077687-09f7-489f-8bbc-9a6d5b1babc3-kube-api-access-b2phw\") pod \"community-operators-sz6wq\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.137517 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:48 crc kubenswrapper[4725]: I0227 07:40:48.801048 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sz6wq"] Feb 27 07:40:49 crc kubenswrapper[4725]: I0227 07:40:49.229722 4725 generic.go:334] "Generic (PLEG): container finished" podID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerID="088047ebfa757a009cc4af0eb9ad2e20845e2c9f874a550e03249426ae06a910" exitCode=0 Feb 27 07:40:49 crc kubenswrapper[4725]: I0227 07:40:49.229817 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerDied","Data":"088047ebfa757a009cc4af0eb9ad2e20845e2c9f874a550e03249426ae06a910"} Feb 27 07:40:49 crc kubenswrapper[4725]: I0227 07:40:49.229944 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerStarted","Data":"35cd40c271e31a460a3d0f472c3a7467693970e00ddc47a54bbe942b205b46b7"} Feb 27 07:40:51 crc kubenswrapper[4725]: I0227 07:40:51.277583 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerStarted","Data":"2e1bf9edc18a5ccac82995ab5e711c9ce1024188384d8d3e104e7f0912a2e099"} Feb 27 07:40:53 crc kubenswrapper[4725]: I0227 07:40:53.295868 4725 generic.go:334] "Generic (PLEG): container finished" podID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" 
containerID="2e1bf9edc18a5ccac82995ab5e711c9ce1024188384d8d3e104e7f0912a2e099" exitCode=0 Feb 27 07:40:53 crc kubenswrapper[4725]: I0227 07:40:53.295972 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerDied","Data":"2e1bf9edc18a5ccac82995ab5e711c9ce1024188384d8d3e104e7f0912a2e099"} Feb 27 07:40:54 crc kubenswrapper[4725]: I0227 07:40:54.306707 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerStarted","Data":"94d011aabf4634c2ac96f9e1f10464adc6698b26cf2f2e581fd9f43ad5994cdc"} Feb 27 07:40:54 crc kubenswrapper[4725]: I0227 07:40:54.338889 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sz6wq" podStartSLOduration=2.8727211009999998 podStartE2EDuration="7.338874694s" podCreationTimestamp="2026-02-27 07:40:47 +0000 UTC" firstStartedPulling="2026-02-27 07:40:49.232386388 +0000 UTC m=+5427.695006967" lastFinishedPulling="2026-02-27 07:40:53.698539981 +0000 UTC m=+5432.161160560" observedRunningTime="2026-02-27 07:40:54.334424449 +0000 UTC m=+5432.797045018" watchObservedRunningTime="2026-02-27 07:40:54.338874694 +0000 UTC m=+5432.801495253" Feb 27 07:40:58 crc kubenswrapper[4725]: I0227 07:40:58.138209 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:58 crc kubenswrapper[4725]: I0227 07:40:58.138659 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:40:59 crc kubenswrapper[4725]: I0227 07:40:59.191779 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sz6wq" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" 
containerName="registry-server" probeResult="failure" output=< Feb 27 07:40:59 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:40:59 crc kubenswrapper[4725]: > Feb 27 07:40:59 crc kubenswrapper[4725]: I0227 07:40:59.353998 4725 generic.go:334] "Generic (PLEG): container finished" podID="354c14f9-8a02-4676-b450-aebe2bfe0d52" containerID="cfd2460c2919f48499125689e29c35a5ef4eddeb928cd7f7a299da7ef5d29c2f" exitCode=0 Feb 27 07:40:59 crc kubenswrapper[4725]: I0227 07:40:59.354038 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" event={"ID":"354c14f9-8a02-4676-b450-aebe2bfe0d52","Type":"ContainerDied","Data":"cfd2460c2919f48499125689e29c35a5ef4eddeb928cd7f7a299da7ef5d29c2f"} Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.621480 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.689359 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-vmqrc"] Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.696255 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-vmqrc"] Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.774000 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvqhj\" (UniqueName: \"kubernetes.io/projected/354c14f9-8a02-4676-b450-aebe2bfe0d52-kube-api-access-zvqhj\") pod \"354c14f9-8a02-4676-b450-aebe2bfe0d52\" (UID: \"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.774114 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/354c14f9-8a02-4676-b450-aebe2bfe0d52-host\") pod \"354c14f9-8a02-4676-b450-aebe2bfe0d52\" (UID: 
\"354c14f9-8a02-4676-b450-aebe2bfe0d52\") " Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.774782 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354c14f9-8a02-4676-b450-aebe2bfe0d52-host" (OuterVolumeSpecName: "host") pod "354c14f9-8a02-4676-b450-aebe2bfe0d52" (UID: "354c14f9-8a02-4676-b450-aebe2bfe0d52"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.804624 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354c14f9-8a02-4676-b450-aebe2bfe0d52-kube-api-access-zvqhj" (OuterVolumeSpecName: "kube-api-access-zvqhj") pod "354c14f9-8a02-4676-b450-aebe2bfe0d52" (UID: "354c14f9-8a02-4676-b450-aebe2bfe0d52"). InnerVolumeSpecName "kube-api-access-zvqhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.877256 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/354c14f9-8a02-4676-b450-aebe2bfe0d52-host\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:00 crc kubenswrapper[4725]: I0227 07:41:00.877301 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvqhj\" (UniqueName: \"kubernetes.io/projected/354c14f9-8a02-4676-b450-aebe2bfe0d52-kube-api-access-zvqhj\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:01 crc kubenswrapper[4725]: I0227 07:41:01.376849 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="314a6367aa87a7df4c66575f18358ac92385c51f7ea3d2a88f0d6e080f3e3d1a" Feb 27 07:41:01 crc kubenswrapper[4725]: I0227 07:41:01.376907 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-vmqrc" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.063105 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-mgmtw"] Feb 27 07:41:02 crc kubenswrapper[4725]: E0227 07:41:02.063879 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354c14f9-8a02-4676-b450-aebe2bfe0d52" containerName="container-00" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.063895 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="354c14f9-8a02-4676-b450-aebe2bfe0d52" containerName="container-00" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.064220 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="354c14f9-8a02-4676-b450-aebe2bfe0d52" containerName="container-00" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.065055 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.209548 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68f07cc4-b003-434b-a345-b122d90a6a8c-host\") pod \"crc-debug-mgmtw\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.209694 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xztdh\" (UniqueName: \"kubernetes.io/projected/68f07cc4-b003-434b-a345-b122d90a6a8c-kube-api-access-xztdh\") pod \"crc-debug-mgmtw\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.267513 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354c14f9-8a02-4676-b450-aebe2bfe0d52" 
path="/var/lib/kubelet/pods/354c14f9-8a02-4676-b450-aebe2bfe0d52/volumes" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.312035 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68f07cc4-b003-434b-a345-b122d90a6a8c-host\") pod \"crc-debug-mgmtw\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.312175 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xztdh\" (UniqueName: \"kubernetes.io/projected/68f07cc4-b003-434b-a345-b122d90a6a8c-kube-api-access-xztdh\") pod \"crc-debug-mgmtw\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.312708 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68f07cc4-b003-434b-a345-b122d90a6a8c-host\") pod \"crc-debug-mgmtw\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.476960 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xztdh\" (UniqueName: \"kubernetes.io/projected/68f07cc4-b003-434b-a345-b122d90a6a8c-kube-api-access-xztdh\") pod \"crc-debug-mgmtw\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:02 crc kubenswrapper[4725]: I0227 07:41:02.681917 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:03 crc kubenswrapper[4725]: I0227 07:41:03.395930 4725 generic.go:334] "Generic (PLEG): container finished" podID="68f07cc4-b003-434b-a345-b122d90a6a8c" containerID="d005b9327468b2a7dc3436d95993d49c934a180d47b7adfff1e5159758f30810" exitCode=0 Feb 27 07:41:03 crc kubenswrapper[4725]: I0227 07:41:03.396590 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" event={"ID":"68f07cc4-b003-434b-a345-b122d90a6a8c","Type":"ContainerDied","Data":"d005b9327468b2a7dc3436d95993d49c934a180d47b7adfff1e5159758f30810"} Feb 27 07:41:03 crc kubenswrapper[4725]: I0227 07:41:03.396655 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" event={"ID":"68f07cc4-b003-434b-a345-b122d90a6a8c","Type":"ContainerStarted","Data":"a80ea9f041b6397050486be33913f8301631b06f6a496d4fb9f5a31c8c0c106b"} Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.536198 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.652033 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xztdh\" (UniqueName: \"kubernetes.io/projected/68f07cc4-b003-434b-a345-b122d90a6a8c-kube-api-access-xztdh\") pod \"68f07cc4-b003-434b-a345-b122d90a6a8c\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.652140 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68f07cc4-b003-434b-a345-b122d90a6a8c-host\") pod \"68f07cc4-b003-434b-a345-b122d90a6a8c\" (UID: \"68f07cc4-b003-434b-a345-b122d90a6a8c\") " Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.652668 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68f07cc4-b003-434b-a345-b122d90a6a8c-host" (OuterVolumeSpecName: "host") pod "68f07cc4-b003-434b-a345-b122d90a6a8c" (UID: "68f07cc4-b003-434b-a345-b122d90a6a8c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.671723 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f07cc4-b003-434b-a345-b122d90a6a8c-kube-api-access-xztdh" (OuterVolumeSpecName: "kube-api-access-xztdh") pod "68f07cc4-b003-434b-a345-b122d90a6a8c" (UID: "68f07cc4-b003-434b-a345-b122d90a6a8c"). InnerVolumeSpecName "kube-api-access-xztdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.754528 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68f07cc4-b003-434b-a345-b122d90a6a8c-host\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:04 crc kubenswrapper[4725]: I0227 07:41:04.754561 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xztdh\" (UniqueName: \"kubernetes.io/projected/68f07cc4-b003-434b-a345-b122d90a6a8c-kube-api-access-xztdh\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:05 crc kubenswrapper[4725]: I0227 07:41:05.415006 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" event={"ID":"68f07cc4-b003-434b-a345-b122d90a6a8c","Type":"ContainerDied","Data":"a80ea9f041b6397050486be33913f8301631b06f6a496d4fb9f5a31c8c0c106b"} Feb 27 07:41:05 crc kubenswrapper[4725]: I0227 07:41:05.415260 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a80ea9f041b6397050486be33913f8301631b06f6a496d4fb9f5a31c8c0c106b" Feb 27 07:41:05 crc kubenswrapper[4725]: I0227 07:41:05.415067 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-mgmtw" Feb 27 07:41:05 crc kubenswrapper[4725]: I0227 07:41:05.475927 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-mgmtw"] Feb 27 07:41:05 crc kubenswrapper[4725]: I0227 07:41:05.491817 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-mgmtw"] Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.263116 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f07cc4-b003-434b-a345-b122d90a6a8c" path="/var/lib/kubelet/pods/68f07cc4-b003-434b-a345-b122d90a6a8c/volumes" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.329054 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s85hn"] Feb 27 07:41:06 crc kubenswrapper[4725]: E0227 07:41:06.329530 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f07cc4-b003-434b-a345-b122d90a6a8c" containerName="container-00" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.329547 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f07cc4-b003-434b-a345-b122d90a6a8c" containerName="container-00" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.329766 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f07cc4-b003-434b-a345-b122d90a6a8c" containerName="container-00" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.331449 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.339716 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s85hn"] Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.487114 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5hqf\" (UniqueName: \"kubernetes.io/projected/6c9e000b-7236-4d73-ae27-33358ad3544c-kube-api-access-p5hqf\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.487196 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-utilities\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.487407 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-catalog-content\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.588967 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hqf\" (UniqueName: \"kubernetes.io/projected/6c9e000b-7236-4d73-ae27-33358ad3544c-kube-api-access-p5hqf\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.589034 4725 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-utilities\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.589146 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-catalog-content\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.589632 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-catalog-content\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.589758 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-utilities\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.619176 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hqf\" (UniqueName: \"kubernetes.io/projected/6c9e000b-7236-4d73-ae27-33358ad3544c-kube-api-access-p5hqf\") pod \"certified-operators-s85hn\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.668413 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.778659 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-dczfn"] Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.782058 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.900811 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvqkf\" (UniqueName: \"kubernetes.io/projected/2e05d49b-46d9-4497-9b99-1927f565d50e-kube-api-access-nvqkf\") pod \"crc-debug-dczfn\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:06 crc kubenswrapper[4725]: I0227 07:41:06.900885 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e05d49b-46d9-4497-9b99-1927f565d50e-host\") pod \"crc-debug-dczfn\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.002377 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvqkf\" (UniqueName: \"kubernetes.io/projected/2e05d49b-46d9-4497-9b99-1927f565d50e-kube-api-access-nvqkf\") pod \"crc-debug-dczfn\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.002443 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e05d49b-46d9-4497-9b99-1927f565d50e-host\") pod \"crc-debug-dczfn\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:07 
crc kubenswrapper[4725]: I0227 07:41:07.002544 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e05d49b-46d9-4497-9b99-1927f565d50e-host\") pod \"crc-debug-dczfn\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.024875 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvqkf\" (UniqueName: \"kubernetes.io/projected/2e05d49b-46d9-4497-9b99-1927f565d50e-kube-api-access-nvqkf\") pod \"crc-debug-dczfn\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.142317 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:07 crc kubenswrapper[4725]: W0227 07:41:07.181324 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e05d49b_46d9_4497_9b99_1927f565d50e.slice/crio-115bb635817f72c550df1a80982755451afb8abf61d3b93aeef11a4c7df38b8b WatchSource:0}: Error finding container 115bb635817f72c550df1a80982755451afb8abf61d3b93aeef11a4c7df38b8b: Status 404 returned error can't find the container with id 115bb635817f72c550df1a80982755451afb8abf61d3b93aeef11a4c7df38b8b Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.258645 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s85hn"] Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.437453 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" event={"ID":"2e05d49b-46d9-4497-9b99-1927f565d50e","Type":"ContainerStarted","Data":"8db10346c3306cba26af4b13c87ba325d84fe6709bb75bc66527beaf545be7b9"} Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 
07:41:07.437502 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" event={"ID":"2e05d49b-46d9-4497-9b99-1927f565d50e","Type":"ContainerStarted","Data":"115bb635817f72c550df1a80982755451afb8abf61d3b93aeef11a4c7df38b8b"} Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.441023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerStarted","Data":"016ce6e8561423c6c46017f9ed4fac4fcc518f62ddfdbb158cf286068a8100c1"} Feb 27 07:41:07 crc kubenswrapper[4725]: I0227 07:41:07.468800 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" podStartSLOduration=1.46878085 podStartE2EDuration="1.46878085s" podCreationTimestamp="2026-02-27 07:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 07:41:07.465810326 +0000 UTC m=+5445.928430895" watchObservedRunningTime="2026-02-27 07:41:07.46878085 +0000 UTC m=+5445.931401419" Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.207151 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.270209 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.450927 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerID="77f79f48bc34aa5b70e718352a0be824900092ec20efeba7e99a04be0bc3ed34" exitCode=0 Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.451045 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" 
event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerDied","Data":"77f79f48bc34aa5b70e718352a0be824900092ec20efeba7e99a04be0bc3ed34"} Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.453747 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.454038 4725 generic.go:334] "Generic (PLEG): container finished" podID="2e05d49b-46d9-4497-9b99-1927f565d50e" containerID="8db10346c3306cba26af4b13c87ba325d84fe6709bb75bc66527beaf545be7b9" exitCode=0 Feb 27 07:41:08 crc kubenswrapper[4725]: I0227 07:41:08.454082 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" event={"ID":"2e05d49b-46d9-4497-9b99-1927f565d50e","Type":"ContainerDied","Data":"8db10346c3306cba26af4b13c87ba325d84fe6709bb75bc66527beaf545be7b9"} Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.471500 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerStarted","Data":"fbf55dbd05bef544130f705464987f9d48d8726080ea90111c37ecd45d7a395f"} Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.601399 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.641160 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-dczfn"] Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.650537 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hkvvl/crc-debug-dczfn"] Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.772820 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvqkf\" (UniqueName: \"kubernetes.io/projected/2e05d49b-46d9-4497-9b99-1927f565d50e-kube-api-access-nvqkf\") pod \"2e05d49b-46d9-4497-9b99-1927f565d50e\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.773107 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e05d49b-46d9-4497-9b99-1927f565d50e-host\") pod \"2e05d49b-46d9-4497-9b99-1927f565d50e\" (UID: \"2e05d49b-46d9-4497-9b99-1927f565d50e\") " Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.773241 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e05d49b-46d9-4497-9b99-1927f565d50e-host" (OuterVolumeSpecName: "host") pod "2e05d49b-46d9-4497-9b99-1927f565d50e" (UID: "2e05d49b-46d9-4497-9b99-1927f565d50e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 07:41:09 crc kubenswrapper[4725]: I0227 07:41:09.773781 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e05d49b-46d9-4497-9b99-1927f565d50e-host\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.185977 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e05d49b-46d9-4497-9b99-1927f565d50e-kube-api-access-nvqkf" (OuterVolumeSpecName: "kube-api-access-nvqkf") pod "2e05d49b-46d9-4497-9b99-1927f565d50e" (UID: "2e05d49b-46d9-4497-9b99-1927f565d50e"). InnerVolumeSpecName "kube-api-access-nvqkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.289218 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvqkf\" (UniqueName: \"kubernetes.io/projected/2e05d49b-46d9-4497-9b99-1927f565d50e-kube-api-access-nvqkf\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.289396 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e05d49b-46d9-4497-9b99-1927f565d50e" path="/var/lib/kubelet/pods/2e05d49b-46d9-4497-9b99-1927f565d50e/volumes" Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.485673 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerID="fbf55dbd05bef544130f705464987f9d48d8726080ea90111c37ecd45d7a395f" exitCode=0 Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.485775 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerDied","Data":"fbf55dbd05bef544130f705464987f9d48d8726080ea90111c37ecd45d7a395f"} Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.488404 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/crc-debug-dczfn" Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.488398 4725 scope.go:117] "RemoveContainer" containerID="8db10346c3306cba26af4b13c87ba325d84fe6709bb75bc66527beaf545be7b9" Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.943555 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sz6wq"] Feb 27 07:41:10 crc kubenswrapper[4725]: I0227 07:41:10.944643 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sz6wq" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="registry-server" containerID="cri-o://94d011aabf4634c2ac96f9e1f10464adc6698b26cf2f2e581fd9f43ad5994cdc" gracePeriod=2 Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.535729 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerStarted","Data":"7848d730f02d02dd9135efceeb2f16d269302b5768425848eff2757f6ae4386e"} Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.549542 4725 generic.go:334] "Generic (PLEG): container finished" podID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerID="94d011aabf4634c2ac96f9e1f10464adc6698b26cf2f2e581fd9f43ad5994cdc" exitCode=0 Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.549815 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerDied","Data":"94d011aabf4634c2ac96f9e1f10464adc6698b26cf2f2e581fd9f43ad5994cdc"} Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.549923 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz6wq" 
event={"ID":"ae077687-09f7-489f-8bbc-9a6d5b1babc3","Type":"ContainerDied","Data":"35cd40c271e31a460a3d0f472c3a7467693970e00ddc47a54bbe942b205b46b7"} Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.550002 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35cd40c271e31a460a3d0f472c3a7467693970e00ddc47a54bbe942b205b46b7" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.559513 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.584824 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s85hn" podStartSLOduration=3.156826995 podStartE2EDuration="5.58480181s" podCreationTimestamp="2026-02-27 07:41:06 +0000 UTC" firstStartedPulling="2026-02-27 07:41:08.453463702 +0000 UTC m=+5446.916084281" lastFinishedPulling="2026-02-27 07:41:10.881438527 +0000 UTC m=+5449.344059096" observedRunningTime="2026-02-27 07:41:11.563250671 +0000 UTC m=+5450.025871230" watchObservedRunningTime="2026-02-27 07:41:11.58480181 +0000 UTC m=+5450.047422389" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.723574 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-catalog-content\") pod \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.723990 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2phw\" (UniqueName: \"kubernetes.io/projected/ae077687-09f7-489f-8bbc-9a6d5b1babc3-kube-api-access-b2phw\") pod \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.724190 4725 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-utilities\") pod \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\" (UID: \"ae077687-09f7-489f-8bbc-9a6d5b1babc3\") " Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.725125 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-utilities" (OuterVolumeSpecName: "utilities") pod "ae077687-09f7-489f-8bbc-9a6d5b1babc3" (UID: "ae077687-09f7-489f-8bbc-9a6d5b1babc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.726111 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.741560 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae077687-09f7-489f-8bbc-9a6d5b1babc3-kube-api-access-b2phw" (OuterVolumeSpecName: "kube-api-access-b2phw") pod "ae077687-09f7-489f-8bbc-9a6d5b1babc3" (UID: "ae077687-09f7-489f-8bbc-9a6d5b1babc3"). InnerVolumeSpecName "kube-api-access-b2phw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.776478 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae077687-09f7-489f-8bbc-9a6d5b1babc3" (UID: "ae077687-09f7-489f-8bbc-9a6d5b1babc3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.828105 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae077687-09f7-489f-8bbc-9a6d5b1babc3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:11 crc kubenswrapper[4725]: I0227 07:41:11.828320 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2phw\" (UniqueName: \"kubernetes.io/projected/ae077687-09f7-489f-8bbc-9a6d5b1babc3-kube-api-access-b2phw\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:12 crc kubenswrapper[4725]: I0227 07:41:12.559421 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sz6wq" Feb 27 07:41:12 crc kubenswrapper[4725]: I0227 07:41:12.589108 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sz6wq"] Feb 27 07:41:12 crc kubenswrapper[4725]: I0227 07:41:12.600843 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sz6wq"] Feb 27 07:41:14 crc kubenswrapper[4725]: I0227 07:41:14.266071 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" path="/var/lib/kubelet/pods/ae077687-09f7-489f-8bbc-9a6d5b1babc3/volumes" Feb 27 07:41:16 crc kubenswrapper[4725]: I0227 07:41:16.681336 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:16 crc kubenswrapper[4725]: I0227 07:41:16.681917 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:16 crc kubenswrapper[4725]: I0227 07:41:16.731339 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:17 crc 
kubenswrapper[4725]: I0227 07:41:17.656026 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:20 crc kubenswrapper[4725]: I0227 07:41:20.322227 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s85hn"] Feb 27 07:41:20 crc kubenswrapper[4725]: I0227 07:41:20.323471 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s85hn" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="registry-server" containerID="cri-o://7848d730f02d02dd9135efceeb2f16d269302b5768425848eff2757f6ae4386e" gracePeriod=2 Feb 27 07:41:20 crc kubenswrapper[4725]: I0227 07:41:20.642541 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerID="7848d730f02d02dd9135efceeb2f16d269302b5768425848eff2757f6ae4386e" exitCode=0 Feb 27 07:41:20 crc kubenswrapper[4725]: I0227 07:41:20.642603 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerDied","Data":"7848d730f02d02dd9135efceeb2f16d269302b5768425848eff2757f6ae4386e"} Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.363912 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.433862 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-utilities\") pod \"6c9e000b-7236-4d73-ae27-33358ad3544c\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.433925 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5hqf\" (UniqueName: \"kubernetes.io/projected/6c9e000b-7236-4d73-ae27-33358ad3544c-kube-api-access-p5hqf\") pod \"6c9e000b-7236-4d73-ae27-33358ad3544c\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.433943 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-catalog-content\") pod \"6c9e000b-7236-4d73-ae27-33358ad3544c\" (UID: \"6c9e000b-7236-4d73-ae27-33358ad3544c\") " Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.440851 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-utilities" (OuterVolumeSpecName: "utilities") pod "6c9e000b-7236-4d73-ae27-33358ad3544c" (UID: "6c9e000b-7236-4d73-ae27-33358ad3544c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.467202 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9e000b-7236-4d73-ae27-33358ad3544c-kube-api-access-p5hqf" (OuterVolumeSpecName: "kube-api-access-p5hqf") pod "6c9e000b-7236-4d73-ae27-33358ad3544c" (UID: "6c9e000b-7236-4d73-ae27-33358ad3544c"). InnerVolumeSpecName "kube-api-access-p5hqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.504557 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c9e000b-7236-4d73-ae27-33358ad3544c" (UID: "6c9e000b-7236-4d73-ae27-33358ad3544c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.536850 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.536917 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5hqf\" (UniqueName: \"kubernetes.io/projected/6c9e000b-7236-4d73-ae27-33358ad3544c-kube-api-access-p5hqf\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.536931 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9e000b-7236-4d73-ae27-33358ad3544c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.654451 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s85hn" event={"ID":"6c9e000b-7236-4d73-ae27-33358ad3544c","Type":"ContainerDied","Data":"016ce6e8561423c6c46017f9ed4fac4fcc518f62ddfdbb158cf286068a8100c1"} Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.654535 4725 scope.go:117] "RemoveContainer" containerID="7848d730f02d02dd9135efceeb2f16d269302b5768425848eff2757f6ae4386e" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.654755 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s85hn" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.676481 4725 scope.go:117] "RemoveContainer" containerID="fbf55dbd05bef544130f705464987f9d48d8726080ea90111c37ecd45d7a395f" Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.694159 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s85hn"] Feb 27 07:41:21 crc kubenswrapper[4725]: I0227 07:41:21.705419 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s85hn"] Feb 27 07:41:22 crc kubenswrapper[4725]: I0227 07:41:22.206847 4725 scope.go:117] "RemoveContainer" containerID="77f79f48bc34aa5b70e718352a0be824900092ec20efeba7e99a04be0bc3ed34" Feb 27 07:41:22 crc kubenswrapper[4725]: I0227 07:41:22.280167 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" path="/var/lib/kubelet/pods/6c9e000b-7236-4d73-ae27-33358ad3544c/volumes" Feb 27 07:41:48 crc kubenswrapper[4725]: I0227 07:41:48.557999 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7959dbc8c4-8fc74_db039076-6d42-4d4e-b0d2-479ae5a91408/barbican-api/0.log" Feb 27 07:41:48 crc kubenswrapper[4725]: I0227 07:41:48.583732 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7959dbc8c4-8fc74_db039076-6d42-4d4e-b0d2-479ae5a91408/barbican-api-log/0.log" Feb 27 07:41:48 crc kubenswrapper[4725]: I0227 07:41:48.733859 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64b777b644-7s9mh_d495df58-14fc-4eb9-a8f1-104b6ca6ce22/barbican-keystone-listener/0.log" Feb 27 07:41:48 crc kubenswrapper[4725]: I0227 07:41:48.804641 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-545fff646c-dqt5j_e4edb4e2-0feb-4075-a823-c02d954872d3/barbican-worker/0.log" Feb 27 07:41:48 crc 
kubenswrapper[4725]: I0227 07:41:48.880779 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64b777b644-7s9mh_d495df58-14fc-4eb9-a8f1-104b6ca6ce22/barbican-keystone-listener-log/0.log" Feb 27 07:41:48 crc kubenswrapper[4725]: I0227 07:41:48.952759 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-545fff646c-dqt5j_e4edb4e2-0feb-4075-a823-c02d954872d3/barbican-worker-log/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.100497 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74_6bd144c7-5dac-46d4-8cab-b3a31a352974/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.262602 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/ceilometer-central-agent/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.305050 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/ceilometer-notification-agent/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.367909 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/proxy-httpd/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.403201 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/sg-core/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.597344 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_819ee261-b129-4874-8f16-5f505d7b3c01/cinder-api-log/0.log" Feb 27 07:41:49 crc kubenswrapper[4725]: I0227 07:41:49.994325 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_819ee261-b129-4874-8f16-5f505d7b3c01/cinder-api/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.162467 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_1c1a66bf-70db-4738-ae7d-4fd930ec4f4d/probe/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.215177 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b71d8cfd-c55f-43fd-b7b7-90c063488103/cinder-scheduler/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.427852 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b71d8cfd-c55f-43fd-b7b7-90c063488103/probe/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.450908 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_1c1a66bf-70db-4738-ae7d-4fd930ec4f4d/cinder-backup/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.638588 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_ba6ad5a5-a980-46a3-8891-5448144c7885/cinder-volume/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.677894 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_ba6ad5a5-a980-46a3-8891-5448144c7885/probe/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.878768 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_be6347d4-c8d8-416d-9229-9671f6a027d4/cinder-volume/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.893596 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_be6347d4-c8d8-416d-9229-9671f6a027d4/probe/0.log" Feb 27 07:41:50 crc kubenswrapper[4725]: I0227 07:41:50.985894 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pj85v_311ba5d5-8172-405b-aead-458a7149e826/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.105885 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn_8b00cf98-bb69-4c5e-8f34-e862f1acf329/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.220567 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-584644fbc5-9wt8c_77a07f13-4e0b-4d51-9e35-2787348e7a63/init/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.416487 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-584644fbc5-9wt8c_77a07f13-4e0b-4d51-9e35-2787348e7a63/init/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.464051 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9_1fab6c47-9849-428c-96a3-96c4cac71f69/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.555297 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-584644fbc5-9wt8c_77a07f13-4e0b-4d51-9e35-2787348e7a63/dnsmasq-dns/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.685188 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4b3b10b4-8a3a-492c-97ce-9ae74040d8ae/glance-log/0.log" Feb 27 07:41:51 crc kubenswrapper[4725]: I0227 07:41:51.718655 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4b3b10b4-8a3a-492c-97ce-9ae74040d8ae/glance-httpd/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.084614 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_41a3d13f-cb8c-42cc-aa8e-12d09fe458f1/glance-httpd/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.136768 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41a3d13f-cb8c-42cc-aa8e-12d09fe458f1/glance-log/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.342546 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f478fcd58-cfjzp_372d4de4-ea8f-4393-af8b-1139e593ac16/horizon/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.380662 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n_8732fb9d-c8a7-4cb3-acca-83301a2c03dc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.594218 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-64s2r_bf027694-e689-4cb8-aaf6-3e848ec2de4b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.803598 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29536261-jgbdj_c53bb79f-c970-4c9e-9a11-c8961e8041ce/keystone-cron/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.952332 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9d1d1822-20db-4d79-9ba2-0746292596c6/kube-state-metrics/0.log" Feb 27 07:41:52 crc kubenswrapper[4725]: I0227 07:41:52.953171 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f478fcd58-cfjzp_372d4de4-ea8f-4393-af8b-1139e593ac16/horizon-log/0.log" Feb 27 07:41:53 crc kubenswrapper[4725]: I0227 07:41:53.134278 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr_501e41e3-55eb-4b62-b4b5-67f594761a64/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:53 crc kubenswrapper[4725]: I0227 07:41:53.137792 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6fccbd6487-2trpv_c5d7d934-34b3-46a7-94d2-0803780d5837/keystone-api/0.log" Feb 27 07:41:53 crc kubenswrapper[4725]: I0227 07:41:53.609795 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5_b5bb9130-cfdc-481b-8e8a-c72f5562b963/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:53 crc kubenswrapper[4725]: I0227 07:41:53.634662 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b86d8c849-9kc54_bdb517d6-290d-43f7-9791-297c8dace84e/neutron-httpd/0.log" Feb 27 07:41:53 crc kubenswrapper[4725]: I0227 07:41:53.735201 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b86d8c849-9kc54_bdb517d6-290d-43f7-9791-297c8dace84e/neutron-api/0.log" Feb 27 07:41:53 crc kubenswrapper[4725]: I0227 07:41:53.805573 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_bc2bb345-ef60-4c05-8461-1821e1db5216/setup-container/0.log" Feb 27 07:41:54 crc kubenswrapper[4725]: I0227 07:41:54.008655 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_bc2bb345-ef60-4c05-8461-1821e1db5216/setup-container/0.log" Feb 27 07:41:54 crc kubenswrapper[4725]: I0227 07:41:54.102936 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_bc2bb345-ef60-4c05-8461-1821e1db5216/rabbitmq/0.log" Feb 27 07:41:54 crc kubenswrapper[4725]: I0227 07:41:54.715790 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_20427622-030f-4f0a-870e-6119d307befa/nova-cell0-conductor-conductor/0.log" Feb 27 07:41:54 crc kubenswrapper[4725]: I0227 07:41:54.953234 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_25ef4822-4a88-4b23-8c61-03d89105d848/nova-cell1-conductor-conductor/0.log" Feb 27 07:41:55 crc kubenswrapper[4725]: I0227 07:41:55.360137 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1/nova-cell1-novncproxy-novncproxy/0.log" Feb 27 07:41:55 crc kubenswrapper[4725]: I0227 07:41:55.544991 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wjx2k_8ac7b33c-a85a-436b-b4c1-560c074fab9b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:55 crc kubenswrapper[4725]: I0227 07:41:55.571862 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a58f39-222d-495a-9cde-272e31f1efae/nova-api-log/0.log" Feb 27 07:41:55 crc kubenswrapper[4725]: I0227 07:41:55.804416 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7a660f84-32ef-4def-90b6-fd4a39e117dc/nova-metadata-log/0.log" Feb 27 07:41:55 crc kubenswrapper[4725]: I0227 07:41:55.814500 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a58f39-222d-495a-9cde-272e31f1efae/nova-api-api/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 07:41:56.055459 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76702ae7-c9e6-485b-abc9-b54e4c073ee1/mysql-bootstrap/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 07:41:56.296456 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76702ae7-c9e6-485b-abc9-b54e4c073ee1/mysql-bootstrap/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 
07:41:56.323356 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d39eae0d-a597-445f-9134-7e2d9f5e82ff/nova-scheduler-scheduler/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 07:41:56.335504 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76702ae7-c9e6-485b-abc9-b54e4c073ee1/galera/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 07:41:56.536764 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7f24b5c8-baad-48b6-9242-2ad6bb6c471f/mysql-bootstrap/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 07:41:56.807901 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7f24b5c8-baad-48b6-9242-2ad6bb6c471f/galera/0.log" Feb 27 07:41:56 crc kubenswrapper[4725]: I0227 07:41:56.850429 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7f24b5c8-baad-48b6-9242-2ad6bb6c471f/mysql-bootstrap/0.log" Feb 27 07:41:57 crc kubenswrapper[4725]: I0227 07:41:57.023027 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6kvbc_03406108-89c6-4681-aeba-c6874d465b62/ovn-controller/0.log" Feb 27 07:41:57 crc kubenswrapper[4725]: I0227 07:41:57.066436 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6c9af008-ad8e-4eaa-b631-543a0ef1bb00/openstackclient/0.log" Feb 27 07:41:57 crc kubenswrapper[4725]: I0227 07:41:57.652663 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7a660f84-32ef-4def-90b6-fd4a39e117dc/nova-metadata-metadata/0.log" Feb 27 07:41:57 crc kubenswrapper[4725]: I0227 07:41:57.806087 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-s2ht5_474555a6-7d91-4881-a4c7-785ccf8185cc/openstack-network-exporter/0.log" Feb 27 07:41:57 crc kubenswrapper[4725]: I0227 07:41:57.838616 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovsdb-server-init/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.003619 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovsdb-server/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.045760 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovsdb-server-init/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.257006 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5c4sz_741a3436-861d-4cb0-925e-597423d841a9/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.317140 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_037dd431-5912-4101-9895-0a6d11e627a6/openstack-network-exporter/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.360592 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovs-vswitchd/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.489253 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_037dd431-5912-4101-9895-0a6d11e627a6/ovn-northd/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.522487 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_67b7afed-e3d9-42c8-9604-9d9e56f1bc1d/openstack-network-exporter/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.607915 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_67b7afed-e3d9-42c8-9604-9d9e56f1bc1d/ovsdbserver-nb/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 
07:41:58.782707 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8a3fa421-de83-44cb-8857-ef6f679f37dc/openstack-network-exporter/0.log" Feb 27 07:41:58 crc kubenswrapper[4725]: I0227 07:41:58.827044 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8a3fa421-de83-44cb-8857-ef6f679f37dc/ovsdbserver-sb/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.132181 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/init-config-reloader/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.167471 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-745fdc9fb8-jhz6h_3f553c85-a79e-4317-9140-708bda9525e2/placement-api/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.238417 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-745fdc9fb8-jhz6h_3f553c85-a79e-4317-9140-708bda9525e2/placement-log/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.757085 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_163ce132-3935-4648-b50f-fab5db3c17ca/memcached/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.869253 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/thanos-sidecar/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.872241 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/init-config-reloader/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 07:41:59.881402 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/prometheus/0.log" Feb 27 07:41:59 crc kubenswrapper[4725]: I0227 
07:41:59.933242 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/config-reloader/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.072708 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89e15a0f-61a2-4114-b1cc-385f54f886d3/setup-container/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141014 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536302-c8jdf"] Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141540 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="extract-content" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141561 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="extract-content" Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141582 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="extract-utilities" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141591 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="extract-utilities" Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141616 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="registry-server" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141625 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="registry-server" Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141647 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="extract-utilities" Feb 27 07:42:00 crc 
kubenswrapper[4725]: I0227 07:42:00.141654 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="extract-utilities" Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141672 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="registry-server" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141679 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="registry-server" Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141702 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e05d49b-46d9-4497-9b99-1927f565d50e" containerName="container-00" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141709 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e05d49b-46d9-4497-9b99-1927f565d50e" containerName="container-00" Feb 27 07:42:00 crc kubenswrapper[4725]: E0227 07:42:00.141721 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="extract-content" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141728 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="extract-content" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141958 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae077687-09f7-489f-8bbc-9a6d5b1babc3" containerName="registry-server" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.141979 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e05d49b-46d9-4497-9b99-1927f565d50e" containerName="container-00" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.142010 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9e000b-7236-4d73-ae27-33358ad3544c" containerName="registry-server" Feb 27 07:42:00 crc 
kubenswrapper[4725]: I0227 07:42:00.142861 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.145042 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.145359 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.146359 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.151978 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536302-c8jdf"] Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.231898 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89e15a0f-61a2-4114-b1cc-385f54f886d3/rabbitmq/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.268800 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e4112b6c-11e8-4244-9a39-c7474ffd192b/setup-container/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.274071 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89e15a0f-61a2-4114-b1cc-385f54f886d3/setup-container/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.322585 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btj2b\" (UniqueName: \"kubernetes.io/projected/d1782d6a-3aef-46f0-a4f6-2839fe3faead-kube-api-access-btj2b\") pod \"auto-csr-approver-29536302-c8jdf\" (UID: \"d1782d6a-3aef-46f0-a4f6-2839fe3faead\") " pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:00 crc 
kubenswrapper[4725]: I0227 07:42:00.424641 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btj2b\" (UniqueName: \"kubernetes.io/projected/d1782d6a-3aef-46f0-a4f6-2839fe3faead-kube-api-access-btj2b\") pod \"auto-csr-approver-29536302-c8jdf\" (UID: \"d1782d6a-3aef-46f0-a4f6-2839fe3faead\") " pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.452572 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e4112b6c-11e8-4244-9a39-c7474ffd192b/rabbitmq/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.457061 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btj2b\" (UniqueName: \"kubernetes.io/projected/d1782d6a-3aef-46f0-a4f6-2839fe3faead-kube-api-access-btj2b\") pod \"auto-csr-approver-29536302-c8jdf\" (UID: \"d1782d6a-3aef-46f0-a4f6-2839fe3faead\") " pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.491558 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.518856 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e4112b6c-11e8-4244-9a39-c7474ffd192b/setup-container/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.524523 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77_9ac46847-17bf-49e5-ae76-1ea3af18c9f5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.819207 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-55xwm_8c8e8aea-4c46-4fe2-844f-2c51d7662fa6/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.821223 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7qtxw_3567a664-44a4-4138-82ec-f35dbffffb40/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.839561 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g_a5ce3d2f-4b00-4971-a37f-3217fd19665a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:42:00 crc kubenswrapper[4725]: I0227 07:42:00.986489 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536302-c8jdf"] Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.026395 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" event={"ID":"d1782d6a-3aef-46f0-a4f6-2839fe3faead","Type":"ContainerStarted","Data":"00041cca8b15b3fdd02431681e60aa2f89fb6c7c9cd90a7488b59ad8a9f3b5c9"} Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.044436 4725 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ctvww_22798438-191d-4ecf-ab5d-23af37e208b3/ssh-known-hosts-edpm-deployment/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.166386 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-559f68776c-7cj2d_39aed367-30f0-4ebd-a057-e33e50a6f748/proxy-server/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.177162 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-559f68776c-7cj2d_39aed367-30f0-4ebd-a057-e33e50a6f748/proxy-httpd/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.247760 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bq24l_40a2ae59-8725-42be-984a-739a82d476c5/swift-ring-rebalance/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.391984 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-reaper/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.411167 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-auditor/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.483302 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-server/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.493705 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-replicator/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.592554 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-auditor/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.620603 4725 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-replicator/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.623612 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-server/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.711356 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-updater/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.725542 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-auditor/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.765986 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-expirer/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.839971 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-server/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.858585 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-replicator/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.906217 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-updater/0.log" Feb 27 07:42:01 crc kubenswrapper[4725]: I0227 07:42:01.941672 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/rsync/0.log" Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.019110 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/swift-recon-cron/0.log" Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.106248 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph_9898977d-f2bf-4be4-9b90-82fbcc11ba8b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.263687 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a4ededce-4af9-418c-af09-c79e79cb044f/tempest-tests-tempest-tests-runner/0.log" Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.316053 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1ef86f5d-c8f3-4077-8184-4aecfa313695/test-operator-logs-container/0.log" Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.381828 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-85q2p_308aa3d5-1a73-49da-98ae-a723be6a9c31/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.554158 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:42:02 crc kubenswrapper[4725]: I0227 07:42:02.554490 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:42:03 crc kubenswrapper[4725]: I0227 07:42:03.043819 4725 
generic.go:334] "Generic (PLEG): container finished" podID="d1782d6a-3aef-46f0-a4f6-2839fe3faead" containerID="0db8397becd726dd055bbebf913628d5a7cd8c585d9e43c5d90caca7e03a6cf2" exitCode=0 Feb 27 07:42:03 crc kubenswrapper[4725]: I0227 07:42:03.043864 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" event={"ID":"d1782d6a-3aef-46f0-a4f6-2839fe3faead","Type":"ContainerDied","Data":"0db8397becd726dd055bbebf913628d5a7cd8c585d9e43c5d90caca7e03a6cf2"} Feb 27 07:42:03 crc kubenswrapper[4725]: I0227 07:42:03.106158 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_b2f1b2e7-bd25-401a-ae31-c49984f2c438/watcher-applier/0.log" Feb 27 07:42:03 crc kubenswrapper[4725]: I0227 07:42:03.874089 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8ca0e165-57b9-4dbd-a8a8-e036ba316122/watcher-api-log/0.log" Feb 27 07:42:04 crc kubenswrapper[4725]: I0227 07:42:04.413725 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:04 crc kubenswrapper[4725]: I0227 07:42:04.510982 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btj2b\" (UniqueName: \"kubernetes.io/projected/d1782d6a-3aef-46f0-a4f6-2839fe3faead-kube-api-access-btj2b\") pod \"d1782d6a-3aef-46f0-a4f6-2839fe3faead\" (UID: \"d1782d6a-3aef-46f0-a4f6-2839fe3faead\") " Feb 27 07:42:04 crc kubenswrapper[4725]: I0227 07:42:04.524531 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1782d6a-3aef-46f0-a4f6-2839fe3faead-kube-api-access-btj2b" (OuterVolumeSpecName: "kube-api-access-btj2b") pod "d1782d6a-3aef-46f0-a4f6-2839fe3faead" (UID: "d1782d6a-3aef-46f0-a4f6-2839fe3faead"). InnerVolumeSpecName "kube-api-access-btj2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:42:04 crc kubenswrapper[4725]: I0227 07:42:04.614534 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btj2b\" (UniqueName: \"kubernetes.io/projected/d1782d6a-3aef-46f0-a4f6-2839fe3faead-kube-api-access-btj2b\") on node \"crc\" DevicePath \"\"" Feb 27 07:42:05 crc kubenswrapper[4725]: I0227 07:42:05.061713 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" event={"ID":"d1782d6a-3aef-46f0-a4f6-2839fe3faead","Type":"ContainerDied","Data":"00041cca8b15b3fdd02431681e60aa2f89fb6c7c9cd90a7488b59ad8a9f3b5c9"} Feb 27 07:42:05 crc kubenswrapper[4725]: I0227 07:42:05.062014 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00041cca8b15b3fdd02431681e60aa2f89fb6c7c9cd90a7488b59ad8a9f3b5c9" Feb 27 07:42:05 crc kubenswrapper[4725]: I0227 07:42:05.061773 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536302-c8jdf" Feb 27 07:42:05 crc kubenswrapper[4725]: I0227 07:42:05.499550 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536296-9wzr4"] Feb 27 07:42:05 crc kubenswrapper[4725]: I0227 07:42:05.502434 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536296-9wzr4"] Feb 27 07:42:06 crc kubenswrapper[4725]: I0227 07:42:06.000949 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_a075032f-0182-44f6-8dd4-b190bf27ed02/watcher-decision-engine/0.log" Feb 27 07:42:06 crc kubenswrapper[4725]: I0227 07:42:06.282956 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bb4d0a-b66c-4493-b990-cd23305b481d" path="/var/lib/kubelet/pods/d8bb4d0a-b66c-4493-b990-cd23305b481d/volumes" Feb 27 07:42:06 crc kubenswrapper[4725]: I0227 07:42:06.697609 4725 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_watcher-api-0_8ca0e165-57b9-4dbd-a8a8-e036ba316122/watcher-api/0.log" Feb 27 07:42:12 crc kubenswrapper[4725]: I0227 07:42:12.625415 4725 scope.go:117] "RemoveContainer" containerID="435d07fddf7be4b6547ca3411980a39f1dc66b7188b15f072d2d5b671434ec5b" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.354528 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/util/0.log" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.478843 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/util/0.log" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.533019 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/pull/0.log" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.551667 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/pull/0.log" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.554379 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.554438 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.727518 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/pull/0.log" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.734958 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/util/0.log" Feb 27 07:42:32 crc kubenswrapper[4725]: I0227 07:42:32.822118 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/extract/0.log" Feb 27 07:42:33 crc kubenswrapper[4725]: I0227 07:42:33.190684 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-xkffm_f1fefb43-64d1-496a-be4b-042d68027526/manager/0.log" Feb 27 07:42:33 crc kubenswrapper[4725]: I0227 07:42:33.617218 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-48l7p_55b7330d-fa67-491c-9354-3ae2f377b245/manager/0.log" Feb 27 07:42:33 crc kubenswrapper[4725]: I0227 07:42:33.950801 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-pklxn_246fa0fd-dd91-4c17-9754-8ed71768660a/manager/0.log" Feb 27 07:42:34 crc kubenswrapper[4725]: I0227 07:42:34.087705 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-jjgd6_672a2ef1-a6d0-41f6-9bbf-5d157863ee48/manager/0.log" Feb 27 07:42:34 crc kubenswrapper[4725]: I0227 07:42:34.642330 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-wc9zj_9eeeac0e-6f80-4882-8d61-effa2342d69b/manager/0.log" Feb 27 07:42:34 crc kubenswrapper[4725]: I0227 07:42:34.990072 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-llwd8_119c1266-bd43-49d6-a39f-93abbf47c2be/manager/0.log" Feb 27 07:42:35 crc kubenswrapper[4725]: I0227 07:42:35.254203 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-4wg4t_1e6b09aa-e1b0-41c7-8aa0-e560de6310d5/manager/0.log" Feb 27 07:42:35 crc kubenswrapper[4725]: I0227 07:42:35.457120 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-zfhwz_76de952b-76db-47de-8891-40006493cf30/manager/0.log" Feb 27 07:42:35 crc kubenswrapper[4725]: I0227 07:42:35.792057 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-x8wnp_28697286-96cb-46ad-a4a5-acc3716aba31/manager/0.log" Feb 27 07:42:35 crc kubenswrapper[4725]: I0227 07:42:35.979937 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-pl57b_a90d813f-86f2-49c9-b7d2-66d44db8236c/manager/0.log" Feb 27 07:42:36 crc kubenswrapper[4725]: I0227 07:42:36.067667 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-4g4xh_1ec01345-1480-48b1-9d36-9dd8a9fc2ef8/manager/0.log" Feb 27 07:42:36 crc kubenswrapper[4725]: I0227 07:42:36.294098 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-pfq48_c32453ad-27be-4f95-bfc1-67878c36f13a/manager/0.log" Feb 27 07:42:36 crc kubenswrapper[4725]: I0227 07:42:36.324520 4725 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-6d6fj_c6de99a3-3c54-4192-8cf6-fab2c5c9750b/manager/0.log" Feb 27 07:42:36 crc kubenswrapper[4725]: I0227 07:42:36.551255 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4_5fccc629-9a1d-4920-b3e7-817e49953fc1/manager/0.log" Feb 27 07:42:36 crc kubenswrapper[4725]: I0227 07:42:36.912450 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7544f859d8-744ft_cc86b762-a7df-42aa-970c-76ebac88b004/operator/0.log" Feb 27 07:42:36 crc kubenswrapper[4725]: I0227 07:42:36.948743 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r75bv_d6b63be0-6e6a-4e30-8648-28a0174338a4/registry-server/0.log" Feb 27 07:42:37 crc kubenswrapper[4725]: I0227 07:42:37.894172 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-vx95x_9972ea1a-4a28-4b7f-b511-9dd8dd3e0599/manager/0.log" Feb 27 07:42:37 crc kubenswrapper[4725]: I0227 07:42:37.913751 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-fgn4m_2e53de05-a35e-4ca4-9776-1492c5030554/manager/0.log" Feb 27 07:42:38 crc kubenswrapper[4725]: I0227 07:42:38.192026 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-zb72l_96664d14-2465-472a-b6c6-5589153d5ee3/operator/0.log" Feb 27 07:42:38 crc kubenswrapper[4725]: I0227 07:42:38.335753 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-m65cb_4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d/manager/0.log" Feb 27 07:42:38 crc kubenswrapper[4725]: I0227 07:42:38.622439 4725 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-jp6zc_6f5e713b-cd6d-482f-8603-4dd47d2297d8/manager/0.log" Feb 27 07:42:38 crc kubenswrapper[4725]: I0227 07:42:38.811192 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-2sqrk_a9acda6b-5c71-406c-985e-c5e026b064c8/manager/0.log" Feb 27 07:42:38 crc kubenswrapper[4725]: I0227 07:42:38.910439 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cb5b7b9c5-kwj9k_31b25662-0274-4176-b3fd-4edd98517298/manager/0.log" Feb 27 07:42:39 crc kubenswrapper[4725]: I0227 07:42:39.023859 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c68576fd-g8db5_59d481fc-2689-420f-b779-c7d840fac75d/manager/0.log" Feb 27 07:42:44 crc kubenswrapper[4725]: I0227 07:42:44.655228 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-6sd5f_6e80b5f0-45bb-4081-808e-800527949f7e/manager/0.log" Feb 27 07:43:00 crc kubenswrapper[4725]: I0227 07:43:00.725178 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x796f_2aec175b-6e2b-4eac-a94f-771881386ffc/control-plane-machine-set-operator/0.log" Feb 27 07:43:00 crc kubenswrapper[4725]: I0227 07:43:00.928461 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-89pl9_682c856f-0661-4039-b071-e5c75267f3f1/kube-rbac-proxy/0.log" Feb 27 07:43:00 crc kubenswrapper[4725]: I0227 07:43:00.955849 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-89pl9_682c856f-0661-4039-b071-e5c75267f3f1/machine-api-operator/0.log" Feb 27 07:43:02 crc kubenswrapper[4725]: I0227 
07:43:02.554448 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:43:02 crc kubenswrapper[4725]: I0227 07:43:02.554727 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:43:02 crc kubenswrapper[4725]: I0227 07:43:02.554764 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:43:02 crc kubenswrapper[4725]: I0227 07:43:02.555541 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3deb9a7f7c3458bea4b2fc28fe6aed78eaa0a969dd24bc29a40c059dd7e94a96"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:43:02 crc kubenswrapper[4725]: I0227 07:43:02.555593 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://3deb9a7f7c3458bea4b2fc28fe6aed78eaa0a969dd24bc29a40c059dd7e94a96" gracePeriod=600 Feb 27 07:43:03 crc kubenswrapper[4725]: I0227 07:43:03.656993 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="3deb9a7f7c3458bea4b2fc28fe6aed78eaa0a969dd24bc29a40c059dd7e94a96" exitCode=0 Feb 27 
07:43:03 crc kubenswrapper[4725]: I0227 07:43:03.657071 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"3deb9a7f7c3458bea4b2fc28fe6aed78eaa0a969dd24bc29a40c059dd7e94a96"} Feb 27 07:43:03 crc kubenswrapper[4725]: I0227 07:43:03.657550 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee"} Feb 27 07:43:03 crc kubenswrapper[4725]: I0227 07:43:03.657571 4725 scope.go:117] "RemoveContainer" containerID="4213454f3cf911a4195b72dfa5d310a24edd0290f1e078dba53b13bd4d831d2c" Feb 27 07:43:14 crc kubenswrapper[4725]: I0227 07:43:14.489664 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4dbtd_1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9/cert-manager-controller/0.log" Feb 27 07:43:14 crc kubenswrapper[4725]: I0227 07:43:14.645128 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ggzrl_17417e2f-0dcc-4720-8766-65a0d193ae26/cert-manager-cainjector/0.log" Feb 27 07:43:14 crc kubenswrapper[4725]: I0227 07:43:14.673126 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-frkvb_4c233d31-e0c7-4e39-9092-7df4e4b23c96/cert-manager-webhook/0.log" Feb 27 07:43:29 crc kubenswrapper[4725]: I0227 07:43:29.498132 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-46jf6_450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea/nmstate-console-plugin/0.log" Feb 27 07:43:29 crc kubenswrapper[4725]: I0227 07:43:29.990745 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bdf7n_73608f57-a852-439f-82b8-364a37b0e88c/kube-rbac-proxy/0.log" Feb 27 07:43:29 crc kubenswrapper[4725]: I0227 07:43:29.998925 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z2kll_3f5fe9d7-0289-42a1-a991-3d0285038f72/nmstate-handler/0.log" Feb 27 07:43:30 crc kubenswrapper[4725]: I0227 07:43:30.018590 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bdf7n_73608f57-a852-439f-82b8-364a37b0e88c/nmstate-metrics/0.log" Feb 27 07:43:30 crc kubenswrapper[4725]: I0227 07:43:30.192383 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-6sx27_7b2eb7dd-736f-4c16-8630-ed2a8607e094/nmstate-webhook/0.log" Feb 27 07:43:30 crc kubenswrapper[4725]: I0227 07:43:30.224250 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-hgq5q_c0b93a17-8e40-4f49-94c7-cf241342c7be/nmstate-operator/0.log" Feb 27 07:43:46 crc kubenswrapper[4725]: I0227 07:43:46.493846 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vpgt6_9fefe362-2058-4721-930e-9651059cfcc8/prometheus-operator/0.log" Feb 27 07:43:46 crc kubenswrapper[4725]: I0227 07:43:46.530432 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74_7e243b80-5980-459f-ba42-90ebdd42e05b/prometheus-operator-admission-webhook/0.log" Feb 27 07:43:46 crc kubenswrapper[4725]: I0227 07:43:46.747175 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj_0f50c85e-bec7-4a58-9317-b86b3ba5e02c/prometheus-operator-admission-webhook/0.log" Feb 27 07:43:46 crc kubenswrapper[4725]: I0227 07:43:46.773725 4725 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-x6b52_5810e280-be69-4236-9014-d459c65bd287/operator/0.log" Feb 27 07:43:46 crc kubenswrapper[4725]: I0227 07:43:46.919512 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9h7hk_0c2b0104-f94a-4e8a-bcd0-464ac8942f54/perses-operator/0.log" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.163753 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536304-qgfcj"] Feb 27 07:44:00 crc kubenswrapper[4725]: E0227 07:44:00.165587 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1782d6a-3aef-46f0-a4f6-2839fe3faead" containerName="oc" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.165616 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1782d6a-3aef-46f0-a4f6-2839fe3faead" containerName="oc" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.166063 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1782d6a-3aef-46f0-a4f6-2839fe3faead" containerName="oc" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.168060 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.171493 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.171736 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536304-qgfcj"] Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.172035 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.179405 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.298891 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkts\" (UniqueName: \"kubernetes.io/projected/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2-kube-api-access-sjkts\") pod \"auto-csr-approver-29536304-qgfcj\" (UID: \"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2\") " pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.400395 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkts\" (UniqueName: \"kubernetes.io/projected/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2-kube-api-access-sjkts\") pod \"auto-csr-approver-29536304-qgfcj\" (UID: \"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2\") " pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.420188 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkts\" (UniqueName: \"kubernetes.io/projected/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2-kube-api-access-sjkts\") pod \"auto-csr-approver-29536304-qgfcj\" (UID: \"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2\") " 
pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:00 crc kubenswrapper[4725]: I0227 07:44:00.503332 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:01 crc kubenswrapper[4725]: I0227 07:44:01.044154 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536304-qgfcj"] Feb 27 07:44:01 crc kubenswrapper[4725]: I0227 07:44:01.255853 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" event={"ID":"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2","Type":"ContainerStarted","Data":"97fda118f35fec50b7cb591a2c1028a5ad85b179cddce909fc402168bae22d84"} Feb 27 07:44:01 crc kubenswrapper[4725]: I0227 07:44:01.822154 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-smlh9_ba636a80-6000-456f-a447-d754b6d0acd2/kube-rbac-proxy/0.log" Feb 27 07:44:01 crc kubenswrapper[4725]: I0227 07:44:01.898621 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-smlh9_ba636a80-6000-456f-a447-d754b6d0acd2/controller/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.060819 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.223765 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.242763 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.289095 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.329247 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.545528 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.565094 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.565144 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.591227 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.760447 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.764526 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/controller/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.784540 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.794573 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.939038 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/frr-metrics/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.964428 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/kube-rbac-proxy-frr/0.log" Feb 27 07:44:02 crc kubenswrapper[4725]: I0227 07:44:02.994276 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/kube-rbac-proxy/0.log" Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.199582 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/reloader/0.log" Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.279909 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-vsfpn_21527ec2-4ffa-49d2-9866-89690a83fa42/frr-k8s-webhook-server/0.log" Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.286640 4725 generic.go:334] "Generic (PLEG): container finished" podID="cab9f9cf-83b9-4ebe-a3fe-f329e7233af2" containerID="d1fa97dd91f082933f57a487e9144ba9565e7aa529419ef8bcfa831b0e781002" exitCode=0 Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.286676 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" event={"ID":"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2","Type":"ContainerDied","Data":"d1fa97dd91f082933f57a487e9144ba9565e7aa529419ef8bcfa831b0e781002"} Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.452776 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-679f87455-8frdc_675e5722-7295-4f2f-acaa-7ad289facd96/manager/0.log" Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.659606 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59c54c548d-8fzq9_78958b18-878f-4fce-b4b0-d799ed1225ce/webhook-server/0.log" Feb 27 07:44:03 crc kubenswrapper[4725]: I0227 07:44:03.731764 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r767c_0b817ed1-7c17-4e44-a421-c43b2c06ec64/kube-rbac-proxy/0.log" Feb 27 07:44:04 crc kubenswrapper[4725]: I0227 07:44:04.416208 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r767c_0b817ed1-7c17-4e44-a421-c43b2c06ec64/speaker/0.log" Feb 27 07:44:04 crc kubenswrapper[4725]: I0227 07:44:04.764839 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:04 crc kubenswrapper[4725]: I0227 07:44:04.805443 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjkts\" (UniqueName: \"kubernetes.io/projected/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2-kube-api-access-sjkts\") pod \"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2\" (UID: \"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2\") " Feb 27 07:44:04 crc kubenswrapper[4725]: I0227 07:44:04.818609 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2-kube-api-access-sjkts" (OuterVolumeSpecName: "kube-api-access-sjkts") pod "cab9f9cf-83b9-4ebe-a3fe-f329e7233af2" (UID: "cab9f9cf-83b9-4ebe-a3fe-f329e7233af2"). InnerVolumeSpecName "kube-api-access-sjkts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:44:04 crc kubenswrapper[4725]: I0227 07:44:04.906937 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjkts\" (UniqueName: \"kubernetes.io/projected/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2-kube-api-access-sjkts\") on node \"crc\" DevicePath \"\"" Feb 27 07:44:04 crc kubenswrapper[4725]: I0227 07:44:04.930794 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/frr/0.log" Feb 27 07:44:05 crc kubenswrapper[4725]: I0227 07:44:05.303718 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" event={"ID":"cab9f9cf-83b9-4ebe-a3fe-f329e7233af2","Type":"ContainerDied","Data":"97fda118f35fec50b7cb591a2c1028a5ad85b179cddce909fc402168bae22d84"} Feb 27 07:44:05 crc kubenswrapper[4725]: I0227 07:44:05.304006 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fda118f35fec50b7cb591a2c1028a5ad85b179cddce909fc402168bae22d84" Feb 27 07:44:05 crc kubenswrapper[4725]: I0227 07:44:05.304053 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536304-qgfcj" Feb 27 07:44:05 crc kubenswrapper[4725]: I0227 07:44:05.841714 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536298-tlcwr"] Feb 27 07:44:05 crc kubenswrapper[4725]: I0227 07:44:05.851928 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536298-tlcwr"] Feb 27 07:44:06 crc kubenswrapper[4725]: I0227 07:44:06.269601 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdab3cf7-f7ce-4ac1-9935-5bd85b61738c" path="/var/lib/kubelet/pods/cdab3cf7-f7ce-4ac1-9935-5bd85b61738c/volumes" Feb 27 07:44:12 crc kubenswrapper[4725]: I0227 07:44:12.753842 4725 scope.go:117] "RemoveContainer" containerID="4595fe44e9ac629797d455a904641679c2e381ecada87c2a9b2aac507d132725" Feb 27 07:44:12 crc kubenswrapper[4725]: I0227 07:44:12.791627 4725 scope.go:117] "RemoveContainer" containerID="221cbb6d8e308df80d2a125f9195097eb3ba84c9d6d61e4fd928160fb394989e" Feb 27 07:44:12 crc kubenswrapper[4725]: I0227 07:44:12.832471 4725 scope.go:117] "RemoveContainer" containerID="6250ba6ec01b8a44e14dc51d3a0e1eead618d6d39aba49fe59bee033be5a2b37" Feb 27 07:44:12 crc kubenswrapper[4725]: I0227 07:44:12.890201 4725 scope.go:117] "RemoveContainer" containerID="e0e0e7d89bf0427f35741574dffa6156ba9ac2576d7e226555870e904e516b2b" Feb 27 07:44:18 crc kubenswrapper[4725]: I0227 07:44:18.761803 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/util/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.512928 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/util/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.516038 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/pull/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.516192 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/pull/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.658720 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/util/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.675209 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/extract/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.678274 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/pull/0.log" Feb 27 07:44:19 crc kubenswrapper[4725]: I0227 07:44:19.810223 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/util/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.009627 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/util/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.016491 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/pull/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.047943 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/pull/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.210773 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/extract/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.229836 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/pull/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.235707 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/util/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.423847 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-utilities/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.575640 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-content/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.600458 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-content/0.log" Feb 27 07:44:20 crc kubenswrapper[4725]: I0227 07:44:20.606605 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-utilities/0.log" Feb 27 07:44:21 crc kubenswrapper[4725]: I0227 07:44:21.663031 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-content/0.log" Feb 27 07:44:21 crc kubenswrapper[4725]: I0227 07:44:21.670506 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-utilities/0.log" Feb 27 07:44:21 crc kubenswrapper[4725]: I0227 07:44:21.872653 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/extract-utilities/0.log" Feb 27 07:44:21 crc kubenswrapper[4725]: I0227 07:44:21.981538 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/registry-server/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.080149 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/extract-utilities/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.094768 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/extract-content/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.103425 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/extract-content/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.270939 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/extract-utilities/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.299208 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/extract-content/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.474232 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/util/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.697153 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/util/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.757518 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/pull/0.log" Feb 27 07:44:22 crc kubenswrapper[4725]: I0227 07:44:22.776039 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/pull/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.130412 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/util/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.158877 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/extract/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 
07:44:23.166081 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bps24_359b62ed-682f-43ae-9a58-1953f516c2d0/registry-server/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.191009 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/pull/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.332434 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-b4kzj_edee26dc-dc59-4500-8fe6-0f9f7e9c4546/marketplace-operator/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.388591 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-utilities/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.564722 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-content/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.588657 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-utilities/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.614166 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-content/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.785420 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-content/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.841663 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-utilities/0.log" Feb 27 07:44:23 crc kubenswrapper[4725]: I0227 07:44:23.924269 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-utilities/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.034057 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-utilities/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.058321 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/registry-server/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.091305 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-content/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.117154 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-content/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.293259 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-utilities/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.341726 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-content/0.log" Feb 27 07:44:24 crc kubenswrapper[4725]: I0227 07:44:24.969030 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/registry-server/0.log" Feb 27 
07:44:39 crc kubenswrapper[4725]: I0227 07:44:39.239369 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vpgt6_9fefe362-2058-4721-930e-9651059cfcc8/prometheus-operator/0.log" Feb 27 07:44:39 crc kubenswrapper[4725]: I0227 07:44:39.260172 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj_0f50c85e-bec7-4a58-9317-b86b3ba5e02c/prometheus-operator-admission-webhook/0.log" Feb 27 07:44:39 crc kubenswrapper[4725]: I0227 07:44:39.267709 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74_7e243b80-5980-459f-ba42-90ebdd42e05b/prometheus-operator-admission-webhook/0.log" Feb 27 07:44:39 crc kubenswrapper[4725]: I0227 07:44:39.461681 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9h7hk_0c2b0104-f94a-4e8a-bcd0-464ac8942f54/perses-operator/0.log" Feb 27 07:44:39 crc kubenswrapper[4725]: I0227 07:44:39.477199 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-x6b52_5810e280-be69-4236-9014-d459c65bd287/operator/0.log" Feb 27 07:44:56 crc kubenswrapper[4725]: E0227 07:44:56.907419 4725 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:39822->38.102.83.192:37635: write tcp 38.102.83.192:39822->38.102.83.192:37635: write: broken pipe Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.149696 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh"] Feb 27 07:45:00 crc kubenswrapper[4725]: E0227 07:45:00.150626 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab9f9cf-83b9-4ebe-a3fe-f329e7233af2" containerName="oc" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 
07:45:00.150639 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab9f9cf-83b9-4ebe-a3fe-f329e7233af2" containerName="oc" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.150851 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab9f9cf-83b9-4ebe-a3fe-f329e7233af2" containerName="oc" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.151647 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.153694 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.153701 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.160088 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh"] Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.170162 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8845c09f-a8c0-41dd-afe8-44c96d73773e-secret-volume\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.170239 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jp7s\" (UniqueName: \"kubernetes.io/projected/8845c09f-a8c0-41dd-afe8-44c96d73773e-kube-api-access-8jp7s\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.170651 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8845c09f-a8c0-41dd-afe8-44c96d73773e-config-volume\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.272497 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8845c09f-a8c0-41dd-afe8-44c96d73773e-secret-volume\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.272577 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jp7s\" (UniqueName: \"kubernetes.io/projected/8845c09f-a8c0-41dd-afe8-44c96d73773e-kube-api-access-8jp7s\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.272674 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8845c09f-a8c0-41dd-afe8-44c96d73773e-config-volume\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.279789 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8845c09f-a8c0-41dd-afe8-44c96d73773e-config-volume\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.286948 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8845c09f-a8c0-41dd-afe8-44c96d73773e-secret-volume\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.296019 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jp7s\" (UniqueName: \"kubernetes.io/projected/8845c09f-a8c0-41dd-afe8-44c96d73773e-kube-api-access-8jp7s\") pod \"collect-profiles-29536305-q5fqh\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:00 crc kubenswrapper[4725]: I0227 07:45:00.481925 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:01 crc kubenswrapper[4725]: I0227 07:45:01.025855 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh"] Feb 27 07:45:01 crc kubenswrapper[4725]: E0227 07:45:01.904380 4725 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8845c09f_a8c0_41dd_afe8_44c96d73773e.slice/crio-8866c0a1309bc9247a38bd58fa6ba4ead91d617374577361dc955f71a8d62039.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8845c09f_a8c0_41dd_afe8_44c96d73773e.slice/crio-conmon-8866c0a1309bc9247a38bd58fa6ba4ead91d617374577361dc955f71a8d62039.scope\": RecentStats: unable to find data in memory cache]" Feb 27 07:45:01 crc kubenswrapper[4725]: I0227 07:45:01.970935 4725 generic.go:334] "Generic (PLEG): container finished" podID="8845c09f-a8c0-41dd-afe8-44c96d73773e" containerID="8866c0a1309bc9247a38bd58fa6ba4ead91d617374577361dc955f71a8d62039" exitCode=0 Feb 27 07:45:01 crc kubenswrapper[4725]: I0227 07:45:01.970992 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" event={"ID":"8845c09f-a8c0-41dd-afe8-44c96d73773e","Type":"ContainerDied","Data":"8866c0a1309bc9247a38bd58fa6ba4ead91d617374577361dc955f71a8d62039"} Feb 27 07:45:01 crc kubenswrapper[4725]: I0227 07:45:01.971021 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" event={"ID":"8845c09f-a8c0-41dd-afe8-44c96d73773e","Type":"ContainerStarted","Data":"eee93950e7092b7a3e95cf4aa0da43e3069ef87c8833232ed746bcd95451d782"} Feb 27 07:45:02 crc kubenswrapper[4725]: I0227 07:45:02.554365 4725 patch_prober.go:28] interesting 
pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:45:02 crc kubenswrapper[4725]: I0227 07:45:02.554653 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.369501 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.460004 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jp7s\" (UniqueName: \"kubernetes.io/projected/8845c09f-a8c0-41dd-afe8-44c96d73773e-kube-api-access-8jp7s\") pod \"8845c09f-a8c0-41dd-afe8-44c96d73773e\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.460474 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8845c09f-a8c0-41dd-afe8-44c96d73773e-config-volume\") pod \"8845c09f-a8c0-41dd-afe8-44c96d73773e\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.460527 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8845c09f-a8c0-41dd-afe8-44c96d73773e-secret-volume\") pod \"8845c09f-a8c0-41dd-afe8-44c96d73773e\" (UID: \"8845c09f-a8c0-41dd-afe8-44c96d73773e\") " Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.461217 
4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8845c09f-a8c0-41dd-afe8-44c96d73773e-config-volume" (OuterVolumeSpecName: "config-volume") pod "8845c09f-a8c0-41dd-afe8-44c96d73773e" (UID: "8845c09f-a8c0-41dd-afe8-44c96d73773e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.466456 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8845c09f-a8c0-41dd-afe8-44c96d73773e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8845c09f-a8c0-41dd-afe8-44c96d73773e" (UID: "8845c09f-a8c0-41dd-afe8-44c96d73773e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.472804 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8845c09f-a8c0-41dd-afe8-44c96d73773e-kube-api-access-8jp7s" (OuterVolumeSpecName: "kube-api-access-8jp7s") pod "8845c09f-a8c0-41dd-afe8-44c96d73773e" (UID: "8845c09f-a8c0-41dd-afe8-44c96d73773e"). InnerVolumeSpecName "kube-api-access-8jp7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.563115 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jp7s\" (UniqueName: \"kubernetes.io/projected/8845c09f-a8c0-41dd-afe8-44c96d73773e-kube-api-access-8jp7s\") on node \"crc\" DevicePath \"\"" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.563161 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8845c09f-a8c0-41dd-afe8-44c96d73773e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:45:03 crc kubenswrapper[4725]: I0227 07:45:03.563173 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8845c09f-a8c0-41dd-afe8-44c96d73773e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 07:45:04 crc kubenswrapper[4725]: I0227 07:45:04.001345 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" event={"ID":"8845c09f-a8c0-41dd-afe8-44c96d73773e","Type":"ContainerDied","Data":"eee93950e7092b7a3e95cf4aa0da43e3069ef87c8833232ed746bcd95451d782"} Feb 27 07:45:04 crc kubenswrapper[4725]: I0227 07:45:04.001420 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eee93950e7092b7a3e95cf4aa0da43e3069ef87c8833232ed746bcd95451d782" Feb 27 07:45:04 crc kubenswrapper[4725]: I0227 07:45:04.001462 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536305-q5fqh" Feb 27 07:45:04 crc kubenswrapper[4725]: I0227 07:45:04.469404 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc"] Feb 27 07:45:04 crc kubenswrapper[4725]: I0227 07:45:04.478723 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536260-ll9cc"] Feb 27 07:45:06 crc kubenswrapper[4725]: I0227 07:45:06.274255 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0f22c8-e06d-4fa8-af82-bc2a826cafa8" path="/var/lib/kubelet/pods/ce0f22c8-e06d-4fa8-af82-bc2a826cafa8/volumes" Feb 27 07:45:12 crc kubenswrapper[4725]: I0227 07:45:12.978566 4725 scope.go:117] "RemoveContainer" containerID="62fc08cb7eb896c9f15ce0eb285c6443dbfae34c972b4a374db616d054bd7c9a" Feb 27 07:45:32 crc kubenswrapper[4725]: I0227 07:45:32.554508 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:45:32 crc kubenswrapper[4725]: I0227 07:45:32.555122 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.168931 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536306-wzmbc"] Feb 27 07:46:00 crc kubenswrapper[4725]: E0227 07:46:00.170193 4725 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8845c09f-a8c0-41dd-afe8-44c96d73773e" containerName="collect-profiles" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.170217 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8845c09f-a8c0-41dd-afe8-44c96d73773e" containerName="collect-profiles" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.170699 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8845c09f-a8c0-41dd-afe8-44c96d73773e" containerName="collect-profiles" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.171972 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.180782 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.181070 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.181228 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.191035 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536306-wzmbc"] Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.215366 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c-kube-api-access-c6j5j\") pod \"auto-csr-approver-29536306-wzmbc\" (UID: \"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c\") " pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.317010 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6j5j\" (UniqueName: 
\"kubernetes.io/projected/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c-kube-api-access-c6j5j\") pod \"auto-csr-approver-29536306-wzmbc\" (UID: \"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c\") " pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.336571 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c-kube-api-access-c6j5j\") pod \"auto-csr-approver-29536306-wzmbc\" (UID: \"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c\") " pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.497599 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:00 crc kubenswrapper[4725]: I0227 07:46:00.980807 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536306-wzmbc"] Feb 27 07:46:01 crc kubenswrapper[4725]: I0227 07:46:01.695838 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" event={"ID":"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c","Type":"ContainerStarted","Data":"32e9c9377691431dd87305edae12bd24bd6c3f724770a1ac495f8ad4fa44bf91"} Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.554569 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.555135 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.555198 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.556435 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.556494 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" gracePeriod=600 Feb 27 07:46:02 crc kubenswrapper[4725]: E0227 07:46:02.686041 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.712260 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" exitCode=0 Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.712365 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee"} Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.712410 4725 scope.go:117] "RemoveContainer" containerID="3deb9a7f7c3458bea4b2fc28fe6aed78eaa0a969dd24bc29a40c059dd7e94a96" Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.713255 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:46:02 crc kubenswrapper[4725]: E0227 07:46:02.713692 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.720101 4725 generic.go:334] "Generic (PLEG): container finished" podID="bfff0880-8ec5-4d86-99e8-b0bac5b0b29c" containerID="7813768d367494d682a7a4eaf844444787970e23a9fd964949555cccdf1a3e78" exitCode=0 Feb 27 07:46:02 crc kubenswrapper[4725]: I0227 07:46:02.720145 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" event={"ID":"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c","Type":"ContainerDied","Data":"7813768d367494d682a7a4eaf844444787970e23a9fd964949555cccdf1a3e78"} Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.106214 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.215922 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c-kube-api-access-c6j5j\") pod \"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c\" (UID: \"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c\") " Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.223318 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c-kube-api-access-c6j5j" (OuterVolumeSpecName: "kube-api-access-c6j5j") pod "bfff0880-8ec5-4d86-99e8-b0bac5b0b29c" (UID: "bfff0880-8ec5-4d86-99e8-b0bac5b0b29c"). InnerVolumeSpecName "kube-api-access-c6j5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.320013 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6j5j\" (UniqueName: \"kubernetes.io/projected/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c-kube-api-access-c6j5j\") on node \"crc\" DevicePath \"\"" Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.748366 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" event={"ID":"bfff0880-8ec5-4d86-99e8-b0bac5b0b29c","Type":"ContainerDied","Data":"32e9c9377691431dd87305edae12bd24bd6c3f724770a1ac495f8ad4fa44bf91"} Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.748772 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e9c9377691431dd87305edae12bd24bd6c3f724770a1ac495f8ad4fa44bf91" Feb 27 07:46:04 crc kubenswrapper[4725]: I0227 07:46:04.748454 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536306-wzmbc" Feb 27 07:46:05 crc kubenswrapper[4725]: I0227 07:46:05.192332 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536300-9w7wc"] Feb 27 07:46:05 crc kubenswrapper[4725]: I0227 07:46:05.203639 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536300-9w7wc"] Feb 27 07:46:06 crc kubenswrapper[4725]: I0227 07:46:06.272667 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16888d09-2b96-4094-af03-7b76e668a81f" path="/var/lib/kubelet/pods/16888d09-2b96-4094-af03-7b76e668a81f/volumes" Feb 27 07:46:13 crc kubenswrapper[4725]: I0227 07:46:13.047268 4725 scope.go:117] "RemoveContainer" containerID="cfd2460c2919f48499125689e29c35a5ef4eddeb928cd7f7a299da7ef5d29c2f" Feb 27 07:46:13 crc kubenswrapper[4725]: I0227 07:46:13.097012 4725 scope.go:117] "RemoveContainer" containerID="5de1d5f8797056411d4b4d7be26b3efa056409ffb1a8b2e857f058bd1fad05f0" Feb 27 07:46:18 crc kubenswrapper[4725]: I0227 07:46:18.251965 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:46:18 crc kubenswrapper[4725]: E0227 07:46:18.252712 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:46:31 crc kubenswrapper[4725]: I0227 07:46:31.252613 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:46:31 crc kubenswrapper[4725]: E0227 07:46:31.253590 4725 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:46:45 crc kubenswrapper[4725]: I0227 07:46:45.252236 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:46:45 crc kubenswrapper[4725]: E0227 07:46:45.253265 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:46:46 crc kubenswrapper[4725]: I0227 07:46:46.058430 4725 generic.go:334] "Generic (PLEG): container finished" podID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerID="5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569" exitCode=0 Feb 27 07:46:46 crc kubenswrapper[4725]: I0227 07:46:46.058495 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" event={"ID":"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f","Type":"ContainerDied","Data":"5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569"} Feb 27 07:46:46 crc kubenswrapper[4725]: I0227 07:46:46.059375 4725 scope.go:117] "RemoveContainer" containerID="5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569" Feb 27 07:46:46 crc kubenswrapper[4725]: I0227 07:46:46.798182 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-hkvvl_must-gather-s9ngw_6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f/gather/0.log" Feb 27 07:46:55 crc kubenswrapper[4725]: I0227 07:46:55.608731 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hkvvl/must-gather-s9ngw"] Feb 27 07:46:55 crc kubenswrapper[4725]: I0227 07:46:55.609465 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="copy" containerID="cri-o://44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80" gracePeriod=2 Feb 27 07:46:55 crc kubenswrapper[4725]: I0227 07:46:55.619176 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hkvvl/must-gather-s9ngw"] Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.081935 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hkvvl_must-gather-s9ngw_6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f/copy/0.log" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.082624 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.170610 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hkvvl_must-gather-s9ngw_6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f/copy/0.log" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.170922 4725 generic.go:334] "Generic (PLEG): container finished" podID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerID="44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80" exitCode=143 Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.170967 4725 scope.go:117] "RemoveContainer" containerID="44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.171026 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hkvvl/must-gather-s9ngw" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.200005 4725 scope.go:117] "RemoveContainer" containerID="5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.235341 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppmsh\" (UniqueName: \"kubernetes.io/projected/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-kube-api-access-ppmsh\") pod \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.235423 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-must-gather-output\") pod \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\" (UID: \"6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f\") " Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.241758 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-kube-api-access-ppmsh" (OuterVolumeSpecName: "kube-api-access-ppmsh") pod "6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" (UID: "6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f"). InnerVolumeSpecName "kube-api-access-ppmsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.268730 4725 scope.go:117] "RemoveContainer" containerID="44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80" Feb 27 07:46:56 crc kubenswrapper[4725]: E0227 07:46:56.271848 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80\": container with ID starting with 44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80 not found: ID does not exist" containerID="44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.271895 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80"} err="failed to get container status \"44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80\": rpc error: code = NotFound desc = could not find container \"44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80\": container with ID starting with 44b30e05d0c56a13766e1b7cdc3fb57dd00abd0f14156fb3b68078e00809db80 not found: ID does not exist" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.271925 4725 scope.go:117] "RemoveContainer" containerID="5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569" Feb 27 07:46:56 crc kubenswrapper[4725]: E0227 07:46:56.275374 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569\": 
container with ID starting with 5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569 not found: ID does not exist" containerID="5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.275407 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569"} err="failed to get container status \"5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569\": rpc error: code = NotFound desc = could not find container \"5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569\": container with ID starting with 5769d822a5e289b15ac2068d19013e4aeefee4c16fae17d13e4c8c7b5f888569 not found: ID does not exist" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.338059 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppmsh\" (UniqueName: \"kubernetes.io/projected/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-kube-api-access-ppmsh\") on node \"crc\" DevicePath \"\"" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.468091 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" (UID: "6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:46:56 crc kubenswrapper[4725]: I0227 07:46:56.541440 4725 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 07:46:57 crc kubenswrapper[4725]: I0227 07:46:57.251746 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:46:57 crc kubenswrapper[4725]: E0227 07:46:57.252494 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:46:58 crc kubenswrapper[4725]: I0227 07:46:58.267440 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" path="/var/lib/kubelet/pods/6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f/volumes" Feb 27 07:47:09 crc kubenswrapper[4725]: I0227 07:47:09.252951 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:47:09 crc kubenswrapper[4725]: E0227 07:47:09.253996 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:47:13 crc kubenswrapper[4725]: I0227 07:47:13.229843 4725 
scope.go:117] "RemoveContainer" containerID="088047ebfa757a009cc4af0eb9ad2e20845e2c9f874a550e03249426ae06a910" Feb 27 07:47:13 crc kubenswrapper[4725]: I0227 07:47:13.276196 4725 scope.go:117] "RemoveContainer" containerID="2e1bf9edc18a5ccac82995ab5e711c9ce1024188384d8d3e104e7f0912a2e099" Feb 27 07:47:13 crc kubenswrapper[4725]: I0227 07:47:13.321888 4725 scope.go:117] "RemoveContainer" containerID="94d011aabf4634c2ac96f9e1f10464adc6698b26cf2f2e581fd9f43ad5994cdc" Feb 27 07:47:13 crc kubenswrapper[4725]: I0227 07:47:13.362410 4725 scope.go:117] "RemoveContainer" containerID="d005b9327468b2a7dc3436d95993d49c934a180d47b7adfff1e5159758f30810" Feb 27 07:47:23 crc kubenswrapper[4725]: I0227 07:47:23.252135 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:47:23 crc kubenswrapper[4725]: E0227 07:47:23.253616 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:47:38 crc kubenswrapper[4725]: I0227 07:47:38.256398 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:47:38 crc kubenswrapper[4725]: E0227 07:47:38.257268 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 
07:47:53 crc kubenswrapper[4725]: I0227 07:47:53.252683 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:47:53 crc kubenswrapper[4725]: E0227 07:47:53.253976 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.162880 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536308-h578z"] Feb 27 07:48:00 crc kubenswrapper[4725]: E0227 07:48:00.164400 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="gather" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.164417 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="gather" Feb 27 07:48:00 crc kubenswrapper[4725]: E0227 07:48:00.164435 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfff0880-8ec5-4d86-99e8-b0bac5b0b29c" containerName="oc" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.164444 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff0880-8ec5-4d86-99e8-b0bac5b0b29c" containerName="oc" Feb 27 07:48:00 crc kubenswrapper[4725]: E0227 07:48:00.164472 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="copy" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.164479 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="copy" Feb 27 07:48:00 crc 
kubenswrapper[4725]: I0227 07:48:00.164720 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="copy" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.164751 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e14cf9f-1aa9-4c1d-82a2-98df7ac5a13f" containerName="gather" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.164767 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfff0880-8ec5-4d86-99e8-b0bac5b0b29c" containerName="oc" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.165790 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.178307 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536308-h578z"] Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.192745 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.193618 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.194123 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.306893 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqqv\" (UniqueName: \"kubernetes.io/projected/a4477496-7340-4a34-b228-fd6cbc1609de-kube-api-access-wkqqv\") pod \"auto-csr-approver-29536308-h578z\" (UID: \"a4477496-7340-4a34-b228-fd6cbc1609de\") " pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.408522 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wkqqv\" (UniqueName: \"kubernetes.io/projected/a4477496-7340-4a34-b228-fd6cbc1609de-kube-api-access-wkqqv\") pod \"auto-csr-approver-29536308-h578z\" (UID: \"a4477496-7340-4a34-b228-fd6cbc1609de\") " pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.430607 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqqv\" (UniqueName: \"kubernetes.io/projected/a4477496-7340-4a34-b228-fd6cbc1609de-kube-api-access-wkqqv\") pod \"auto-csr-approver-29536308-h578z\" (UID: \"a4477496-7340-4a34-b228-fd6cbc1609de\") " pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.511655 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.697895 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7vrcw"] Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.700926 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.710966 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vrcw"] Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.817051 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-catalog-content\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.817230 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-utilities\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.817278 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns2ff\" (UniqueName: \"kubernetes.io/projected/4b93ca97-5a21-4db2-91bc-aa43f72ae588-kube-api-access-ns2ff\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.919009 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-catalog-content\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.919109 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-utilities\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.919153 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns2ff\" (UniqueName: \"kubernetes.io/projected/4b93ca97-5a21-4db2-91bc-aa43f72ae588-kube-api-access-ns2ff\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.919872 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-catalog-content\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.919978 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-utilities\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.940349 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns2ff\" (UniqueName: \"kubernetes.io/projected/4b93ca97-5a21-4db2-91bc-aa43f72ae588-kube-api-access-ns2ff\") pod \"redhat-marketplace-7vrcw\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.996348 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29536308-h578z"] Feb 27 07:48:00 crc kubenswrapper[4725]: I0227 07:48:00.997256 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:48:01 crc kubenswrapper[4725]: I0227 07:48:01.034711 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:01 crc kubenswrapper[4725]: I0227 07:48:01.488922 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vrcw"] Feb 27 07:48:01 crc kubenswrapper[4725]: W0227 07:48:01.513692 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b93ca97_5a21_4db2_91bc_aa43f72ae588.slice/crio-e0631d0dd1de80f3c77f7cf504f009632caa61b856bcacd4009b9b6ca2cece69 WatchSource:0}: Error finding container e0631d0dd1de80f3c77f7cf504f009632caa61b856bcacd4009b9b6ca2cece69: Status 404 returned error can't find the container with id e0631d0dd1de80f3c77f7cf504f009632caa61b856bcacd4009b9b6ca2cece69 Feb 27 07:48:01 crc kubenswrapper[4725]: I0227 07:48:01.866945 4725 generic.go:334] "Generic (PLEG): container finished" podID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerID="1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b" exitCode=0 Feb 27 07:48:01 crc kubenswrapper[4725]: I0227 07:48:01.867223 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerDied","Data":"1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b"} Feb 27 07:48:01 crc kubenswrapper[4725]: I0227 07:48:01.867363 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" 
event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerStarted","Data":"e0631d0dd1de80f3c77f7cf504f009632caa61b856bcacd4009b9b6ca2cece69"} Feb 27 07:48:01 crc kubenswrapper[4725]: I0227 07:48:01.870241 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536308-h578z" event={"ID":"a4477496-7340-4a34-b228-fd6cbc1609de","Type":"ContainerStarted","Data":"a3006add986c7a8efe7528f95e117dc6af1111af99d0439c422ca9acd6d41176"} Feb 27 07:48:02 crc kubenswrapper[4725]: I0227 07:48:02.889621 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerStarted","Data":"cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522"} Feb 27 07:48:02 crc kubenswrapper[4725]: I0227 07:48:02.894200 4725 generic.go:334] "Generic (PLEG): container finished" podID="a4477496-7340-4a34-b228-fd6cbc1609de" containerID="60ece3ad52b4f8469c5776a116b6037721c7514351f568eb508b8e3bc3e624b9" exitCode=0 Feb 27 07:48:02 crc kubenswrapper[4725]: I0227 07:48:02.894239 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536308-h578z" event={"ID":"a4477496-7340-4a34-b228-fd6cbc1609de","Type":"ContainerDied","Data":"60ece3ad52b4f8469c5776a116b6037721c7514351f568eb508b8e3bc3e624b9"} Feb 27 07:48:03 crc kubenswrapper[4725]: I0227 07:48:03.909456 4725 generic.go:334] "Generic (PLEG): container finished" podID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerID="cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522" exitCode=0 Feb 27 07:48:03 crc kubenswrapper[4725]: I0227 07:48:03.909677 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerDied","Data":"cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522"} Feb 27 07:48:04 crc kubenswrapper[4725]: 
I0227 07:48:04.281269 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.401654 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkqqv\" (UniqueName: \"kubernetes.io/projected/a4477496-7340-4a34-b228-fd6cbc1609de-kube-api-access-wkqqv\") pod \"a4477496-7340-4a34-b228-fd6cbc1609de\" (UID: \"a4477496-7340-4a34-b228-fd6cbc1609de\") " Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.406755 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4477496-7340-4a34-b228-fd6cbc1609de-kube-api-access-wkqqv" (OuterVolumeSpecName: "kube-api-access-wkqqv") pod "a4477496-7340-4a34-b228-fd6cbc1609de" (UID: "a4477496-7340-4a34-b228-fd6cbc1609de"). InnerVolumeSpecName "kube-api-access-wkqqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.505879 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkqqv\" (UniqueName: \"kubernetes.io/projected/a4477496-7340-4a34-b228-fd6cbc1609de-kube-api-access-wkqqv\") on node \"crc\" DevicePath \"\"" Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.922951 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536308-h578z" event={"ID":"a4477496-7340-4a34-b228-fd6cbc1609de","Type":"ContainerDied","Data":"a3006add986c7a8efe7528f95e117dc6af1111af99d0439c422ca9acd6d41176"} Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.923239 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3006add986c7a8efe7528f95e117dc6af1111af99d0439c422ca9acd6d41176" Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.923050 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536308-h578z" Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.925031 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerStarted","Data":"f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff"} Feb 27 07:48:04 crc kubenswrapper[4725]: I0227 07:48:04.949894 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7vrcw" podStartSLOduration=2.46891778 podStartE2EDuration="4.949875201s" podCreationTimestamp="2026-02-27 07:48:00 +0000 UTC" firstStartedPulling="2026-02-27 07:48:01.871733876 +0000 UTC m=+5860.334354445" lastFinishedPulling="2026-02-27 07:48:04.352691287 +0000 UTC m=+5862.815311866" observedRunningTime="2026-02-27 07:48:04.944501019 +0000 UTC m=+5863.407121628" watchObservedRunningTime="2026-02-27 07:48:04.949875201 +0000 UTC m=+5863.412495780" Feb 27 07:48:05 crc kubenswrapper[4725]: I0227 07:48:05.360264 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536302-c8jdf"] Feb 27 07:48:05 crc kubenswrapper[4725]: I0227 07:48:05.370150 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536302-c8jdf"] Feb 27 07:48:06 crc kubenswrapper[4725]: I0227 07:48:06.251991 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:48:06 crc kubenswrapper[4725]: E0227 07:48:06.252913 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:48:06 crc kubenswrapper[4725]: I0227 07:48:06.261857 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1782d6a-3aef-46f0-a4f6-2839fe3faead" path="/var/lib/kubelet/pods/d1782d6a-3aef-46f0-a4f6-2839fe3faead/volumes" Feb 27 07:48:11 crc kubenswrapper[4725]: I0227 07:48:11.035682 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:11 crc kubenswrapper[4725]: I0227 07:48:11.037162 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:11 crc kubenswrapper[4725]: I0227 07:48:11.084092 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:12 crc kubenswrapper[4725]: I0227 07:48:12.041140 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:12 crc kubenswrapper[4725]: I0227 07:48:12.088214 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vrcw"] Feb 27 07:48:13 crc kubenswrapper[4725]: I0227 07:48:13.457028 4725 scope.go:117] "RemoveContainer" containerID="0db8397becd726dd055bbebf913628d5a7cd8c585d9e43c5d90caca7e03a6cf2" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.018628 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7vrcw" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="registry-server" containerID="cri-o://f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff" gracePeriod=2 Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.549919 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.554133 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-catalog-content\") pod \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.554805 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-utilities\") pod \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.555075 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns2ff\" (UniqueName: \"kubernetes.io/projected/4b93ca97-5a21-4db2-91bc-aa43f72ae588-kube-api-access-ns2ff\") pod \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\" (UID: \"4b93ca97-5a21-4db2-91bc-aa43f72ae588\") " Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.555817 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-utilities" (OuterVolumeSpecName: "utilities") pod "4b93ca97-5a21-4db2-91bc-aa43f72ae588" (UID: "4b93ca97-5a21-4db2-91bc-aa43f72ae588"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.557426 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.562869 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b93ca97-5a21-4db2-91bc-aa43f72ae588-kube-api-access-ns2ff" (OuterVolumeSpecName: "kube-api-access-ns2ff") pod "4b93ca97-5a21-4db2-91bc-aa43f72ae588" (UID: "4b93ca97-5a21-4db2-91bc-aa43f72ae588"). InnerVolumeSpecName "kube-api-access-ns2ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.614900 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b93ca97-5a21-4db2-91bc-aa43f72ae588" (UID: "4b93ca97-5a21-4db2-91bc-aa43f72ae588"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.659627 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns2ff\" (UniqueName: \"kubernetes.io/projected/4b93ca97-5a21-4db2-91bc-aa43f72ae588-kube-api-access-ns2ff\") on node \"crc\" DevicePath \"\"" Feb 27 07:48:14 crc kubenswrapper[4725]: I0227 07:48:14.659668 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b93ca97-5a21-4db2-91bc-aa43f72ae588-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.031878 4725 generic.go:334] "Generic (PLEG): container finished" podID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerID="f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff" exitCode=0 Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.031937 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerDied","Data":"f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff"} Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.031968 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vrcw" event={"ID":"4b93ca97-5a21-4db2-91bc-aa43f72ae588","Type":"ContainerDied","Data":"e0631d0dd1de80f3c77f7cf504f009632caa61b856bcacd4009b9b6ca2cece69"} Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.031989 4725 scope.go:117] "RemoveContainer" containerID="f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.032508 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vrcw" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.060875 4725 scope.go:117] "RemoveContainer" containerID="cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.079543 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vrcw"] Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.089935 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vrcw"] Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.100148 4725 scope.go:117] "RemoveContainer" containerID="1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.133312 4725 scope.go:117] "RemoveContainer" containerID="f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff" Feb 27 07:48:15 crc kubenswrapper[4725]: E0227 07:48:15.133740 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff\": container with ID starting with f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff not found: ID does not exist" containerID="f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.133772 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff"} err="failed to get container status \"f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff\": rpc error: code = NotFound desc = could not find container \"f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff\": container with ID starting with f3a56aefc92655625a6c28ef9c94717231983daa23d22c8e342c06b778a9feff not found: 
ID does not exist" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.133792 4725 scope.go:117] "RemoveContainer" containerID="cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522" Feb 27 07:48:15 crc kubenswrapper[4725]: E0227 07:48:15.134046 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522\": container with ID starting with cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522 not found: ID does not exist" containerID="cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.134069 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522"} err="failed to get container status \"cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522\": rpc error: code = NotFound desc = could not find container \"cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522\": container with ID starting with cd71e531aae442ce55294c4996c256ac6099cc1022080f0e1bf044a58ec51522 not found: ID does not exist" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.134082 4725 scope.go:117] "RemoveContainer" containerID="1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b" Feb 27 07:48:15 crc kubenswrapper[4725]: E0227 07:48:15.134468 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b\": container with ID starting with 1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b not found: ID does not exist" containerID="1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b" Feb 27 07:48:15 crc kubenswrapper[4725]: I0227 07:48:15.134490 4725 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b"} err="failed to get container status \"1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b\": rpc error: code = NotFound desc = could not find container \"1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b\": container with ID starting with 1daf7bb677487d3a4d54df7089a240ccd09b49986ed1520b11d80fef1230967b not found: ID does not exist" Feb 27 07:48:16 crc kubenswrapper[4725]: I0227 07:48:16.275400 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" path="/var/lib/kubelet/pods/4b93ca97-5a21-4db2-91bc-aa43f72ae588/volumes" Feb 27 07:48:19 crc kubenswrapper[4725]: I0227 07:48:19.251953 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:48:19 crc kubenswrapper[4725]: E0227 07:48:19.254687 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:48:34 crc kubenswrapper[4725]: I0227 07:48:34.252005 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:48:34 crc kubenswrapper[4725]: E0227 07:48:34.252727 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:48:48 crc kubenswrapper[4725]: I0227 07:48:48.252011 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:48:48 crc kubenswrapper[4725]: E0227 07:48:48.252844 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:48:59 crc kubenswrapper[4725]: I0227 07:48:59.251960 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:48:59 crc kubenswrapper[4725]: E0227 07:48:59.252954 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:49:14 crc kubenswrapper[4725]: I0227 07:49:14.252086 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:49:14 crc kubenswrapper[4725]: E0227 07:49:14.252919 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:49:27 crc kubenswrapper[4725]: I0227 07:49:27.251791 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:49:27 crc kubenswrapper[4725]: E0227 07:49:27.252807 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:49:41 crc kubenswrapper[4725]: I0227 07:49:41.253039 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:49:41 crc kubenswrapper[4725]: E0227 07:49:41.254155 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:49:54 crc kubenswrapper[4725]: I0227 07:49:54.252405 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:49:54 crc kubenswrapper[4725]: E0227 07:49:54.253253 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.156487 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536310-wlqrr"] Feb 27 07:50:00 crc kubenswrapper[4725]: E0227 07:50:00.157610 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="extract-content" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.157628 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="extract-content" Feb 27 07:50:00 crc kubenswrapper[4725]: E0227 07:50:00.157657 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="extract-utilities" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.157668 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="extract-utilities" Feb 27 07:50:00 crc kubenswrapper[4725]: E0227 07:50:00.157704 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="registry-server" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.157717 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="registry-server" Feb 27 07:50:00 crc kubenswrapper[4725]: E0227 07:50:00.157734 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4477496-7340-4a34-b228-fd6cbc1609de" containerName="oc" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.157745 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4477496-7340-4a34-b228-fd6cbc1609de" containerName="oc" Feb 27 
07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.158005 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4477496-7340-4a34-b228-fd6cbc1609de" containerName="oc" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.158047 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b93ca97-5a21-4db2-91bc-aa43f72ae588" containerName="registry-server" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.158922 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.161931 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.163950 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.165035 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.177515 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536310-wlqrr"] Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.286685 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmc9\" (UniqueName: \"kubernetes.io/projected/30bbbaae-b9a2-4d51-9645-ea46e88e15a5-kube-api-access-hrmc9\") pod \"auto-csr-approver-29536310-wlqrr\" (UID: \"30bbbaae-b9a2-4d51-9645-ea46e88e15a5\") " pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.389986 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmc9\" (UniqueName: \"kubernetes.io/projected/30bbbaae-b9a2-4d51-9645-ea46e88e15a5-kube-api-access-hrmc9\") pod 
\"auto-csr-approver-29536310-wlqrr\" (UID: \"30bbbaae-b9a2-4d51-9645-ea46e88e15a5\") " pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.413937 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrmc9\" (UniqueName: \"kubernetes.io/projected/30bbbaae-b9a2-4d51-9645-ea46e88e15a5-kube-api-access-hrmc9\") pod \"auto-csr-approver-29536310-wlqrr\" (UID: \"30bbbaae-b9a2-4d51-9645-ea46e88e15a5\") " pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.488353 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:00 crc kubenswrapper[4725]: I0227 07:50:00.932415 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536310-wlqrr"] Feb 27 07:50:00 crc kubenswrapper[4725]: W0227 07:50:00.935413 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30bbbaae_b9a2_4d51_9645_ea46e88e15a5.slice/crio-ce8e2e42ddcc580ee976a223317d39552b5affe75f6c01ca7e4adaf057045c26 WatchSource:0}: Error finding container ce8e2e42ddcc580ee976a223317d39552b5affe75f6c01ca7e4adaf057045c26: Status 404 returned error can't find the container with id ce8e2e42ddcc580ee976a223317d39552b5affe75f6c01ca7e4adaf057045c26 Feb 27 07:50:01 crc kubenswrapper[4725]: I0227 07:50:01.220389 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" event={"ID":"30bbbaae-b9a2-4d51-9645-ea46e88e15a5","Type":"ContainerStarted","Data":"ce8e2e42ddcc580ee976a223317d39552b5affe75f6c01ca7e4adaf057045c26"} Feb 27 07:50:02 crc kubenswrapper[4725]: I0227 07:50:02.230159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" 
event={"ID":"30bbbaae-b9a2-4d51-9645-ea46e88e15a5","Type":"ContainerStarted","Data":"1c7a34575176eb2886cde7c02c97d2f1eebd2cdcc03a34aab1baf7456de28d18"} Feb 27 07:50:02 crc kubenswrapper[4725]: I0227 07:50:02.248175 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" podStartSLOduration=1.299329717 podStartE2EDuration="2.248159416s" podCreationTimestamp="2026-02-27 07:50:00 +0000 UTC" firstStartedPulling="2026-02-27 07:50:00.940088346 +0000 UTC m=+5979.402708935" lastFinishedPulling="2026-02-27 07:50:01.888918065 +0000 UTC m=+5980.351538634" observedRunningTime="2026-02-27 07:50:02.245920403 +0000 UTC m=+5980.708541002" watchObservedRunningTime="2026-02-27 07:50:02.248159416 +0000 UTC m=+5980.710779985" Feb 27 07:50:03 crc kubenswrapper[4725]: I0227 07:50:03.243463 4725 generic.go:334] "Generic (PLEG): container finished" podID="30bbbaae-b9a2-4d51-9645-ea46e88e15a5" containerID="1c7a34575176eb2886cde7c02c97d2f1eebd2cdcc03a34aab1baf7456de28d18" exitCode=0 Feb 27 07:50:03 crc kubenswrapper[4725]: I0227 07:50:03.243568 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" event={"ID":"30bbbaae-b9a2-4d51-9645-ea46e88e15a5","Type":"ContainerDied","Data":"1c7a34575176eb2886cde7c02c97d2f1eebd2cdcc03a34aab1baf7456de28d18"} Feb 27 07:50:04 crc kubenswrapper[4725]: I0227 07:50:04.652216 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:04 crc kubenswrapper[4725]: I0227 07:50:04.803354 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrmc9\" (UniqueName: \"kubernetes.io/projected/30bbbaae-b9a2-4d51-9645-ea46e88e15a5-kube-api-access-hrmc9\") pod \"30bbbaae-b9a2-4d51-9645-ea46e88e15a5\" (UID: \"30bbbaae-b9a2-4d51-9645-ea46e88e15a5\") " Feb 27 07:50:04 crc kubenswrapper[4725]: I0227 07:50:04.808838 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30bbbaae-b9a2-4d51-9645-ea46e88e15a5-kube-api-access-hrmc9" (OuterVolumeSpecName: "kube-api-access-hrmc9") pod "30bbbaae-b9a2-4d51-9645-ea46e88e15a5" (UID: "30bbbaae-b9a2-4d51-9645-ea46e88e15a5"). InnerVolumeSpecName "kube-api-access-hrmc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:50:04 crc kubenswrapper[4725]: I0227 07:50:04.906745 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrmc9\" (UniqueName: \"kubernetes.io/projected/30bbbaae-b9a2-4d51-9645-ea46e88e15a5-kube-api-access-hrmc9\") on node \"crc\" DevicePath \"\"" Feb 27 07:50:05 crc kubenswrapper[4725]: I0227 07:50:05.266726 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" event={"ID":"30bbbaae-b9a2-4d51-9645-ea46e88e15a5","Type":"ContainerDied","Data":"ce8e2e42ddcc580ee976a223317d39552b5affe75f6c01ca7e4adaf057045c26"} Feb 27 07:50:05 crc kubenswrapper[4725]: I0227 07:50:05.266808 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8e2e42ddcc580ee976a223317d39552b5affe75f6c01ca7e4adaf057045c26" Feb 27 07:50:05 crc kubenswrapper[4725]: I0227 07:50:05.266907 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536310-wlqrr" Feb 27 07:50:05 crc kubenswrapper[4725]: I0227 07:50:05.345183 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536304-qgfcj"] Feb 27 07:50:05 crc kubenswrapper[4725]: I0227 07:50:05.363037 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536304-qgfcj"] Feb 27 07:50:06 crc kubenswrapper[4725]: I0227 07:50:06.251437 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:50:06 crc kubenswrapper[4725]: E0227 07:50:06.251894 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:50:06 crc kubenswrapper[4725]: I0227 07:50:06.269179 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab9f9cf-83b9-4ebe-a3fe-f329e7233af2" path="/var/lib/kubelet/pods/cab9f9cf-83b9-4ebe-a3fe-f329e7233af2/volumes" Feb 27 07:50:13 crc kubenswrapper[4725]: I0227 07:50:13.574891 4725 scope.go:117] "RemoveContainer" containerID="d1fa97dd91f082933f57a487e9144ba9565e7aa529419ef8bcfa831b0e781002" Feb 27 07:50:17 crc kubenswrapper[4725]: I0227 07:50:17.251636 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:50:17 crc kubenswrapper[4725]: E0227 07:50:17.252417 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.741300 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgvhz/must-gather-xtbcg"] Feb 27 07:50:22 crc kubenswrapper[4725]: E0227 07:50:22.742400 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30bbbaae-b9a2-4d51-9645-ea46e88e15a5" containerName="oc" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.742419 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="30bbbaae-b9a2-4d51-9645-ea46e88e15a5" containerName="oc" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.742672 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="30bbbaae-b9a2-4d51-9645-ea46e88e15a5" containerName="oc" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.744118 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.749451 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bgvhz"/"openshift-service-ca.crt" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.755919 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bgvhz"/"kube-root-ca.crt" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.776486 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgvhz/must-gather-xtbcg"] Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.808216 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbp68\" (UniqueName: \"kubernetes.io/projected/4ae06e21-fee2-4230-82be-fbe8eb29deeb-kube-api-access-xbp68\") pod \"must-gather-xtbcg\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.808375 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ae06e21-fee2-4230-82be-fbe8eb29deeb-must-gather-output\") pod \"must-gather-xtbcg\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.910010 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbp68\" (UniqueName: \"kubernetes.io/projected/4ae06e21-fee2-4230-82be-fbe8eb29deeb-kube-api-access-xbp68\") pod \"must-gather-xtbcg\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.910124 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ae06e21-fee2-4230-82be-fbe8eb29deeb-must-gather-output\") pod \"must-gather-xtbcg\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.910636 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ae06e21-fee2-4230-82be-fbe8eb29deeb-must-gather-output\") pod \"must-gather-xtbcg\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:22 crc kubenswrapper[4725]: I0227 07:50:22.932360 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbp68\" (UniqueName: \"kubernetes.io/projected/4ae06e21-fee2-4230-82be-fbe8eb29deeb-kube-api-access-xbp68\") pod \"must-gather-xtbcg\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:23 crc kubenswrapper[4725]: I0227 07:50:23.065557 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:50:23 crc kubenswrapper[4725]: I0227 07:50:23.610423 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgvhz/must-gather-xtbcg"] Feb 27 07:50:24 crc kubenswrapper[4725]: I0227 07:50:24.518618 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" event={"ID":"4ae06e21-fee2-4230-82be-fbe8eb29deeb","Type":"ContainerStarted","Data":"0b56db439e8201055683121b7621a7c00faf43c98f969968fbae062406851d5a"} Feb 27 07:50:24 crc kubenswrapper[4725]: I0227 07:50:24.519159 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" event={"ID":"4ae06e21-fee2-4230-82be-fbe8eb29deeb","Type":"ContainerStarted","Data":"3d888a7718927cae308366e52c898ed32a8418c1619c427b93f78b858a902779"} Feb 27 07:50:24 crc kubenswrapper[4725]: I0227 07:50:24.519219 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" event={"ID":"4ae06e21-fee2-4230-82be-fbe8eb29deeb","Type":"ContainerStarted","Data":"fac60b456153e2a8b98454bedb2821bdbdbf4fc2f234848a37f436d273903187"} Feb 27 07:50:24 crc kubenswrapper[4725]: I0227 07:50:24.537653 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" podStartSLOduration=2.53762725 podStartE2EDuration="2.53762725s" podCreationTimestamp="2026-02-27 07:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 07:50:24.536669863 +0000 UTC m=+6002.999290442" watchObservedRunningTime="2026-02-27 07:50:24.53762725 +0000 UTC m=+6003.000247819" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.678441 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-fl22n"] Feb 27 07:50:27 crc kubenswrapper[4725]: 
I0227 07:50:27.680641 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.683504 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bgvhz"/"default-dockercfg-5zmmv" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.721766 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-host\") pod \"crc-debug-fl22n\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.721815 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrxk\" (UniqueName: \"kubernetes.io/projected/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-kube-api-access-shrxk\") pod \"crc-debug-fl22n\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.823482 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-host\") pod \"crc-debug-fl22n\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.823530 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrxk\" (UniqueName: \"kubernetes.io/projected/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-kube-api-access-shrxk\") pod \"crc-debug-fl22n\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.823717 4725 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-host\") pod \"crc-debug-fl22n\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:27 crc kubenswrapper[4725]: I0227 07:50:27.860100 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrxk\" (UniqueName: \"kubernetes.io/projected/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-kube-api-access-shrxk\") pod \"crc-debug-fl22n\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:28 crc kubenswrapper[4725]: I0227 07:50:28.008722 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:50:28 crc kubenswrapper[4725]: W0227 07:50:28.043768 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60a7b1cd_8ac5_4061_92ef_ad5e5a8b91ca.slice/crio-f86e391c0372d647d04ed54ffe7aaedbbd672234cf31cc4b959a91b551f1c73c WatchSource:0}: Error finding container f86e391c0372d647d04ed54ffe7aaedbbd672234cf31cc4b959a91b551f1c73c: Status 404 returned error can't find the container with id f86e391c0372d647d04ed54ffe7aaedbbd672234cf31cc4b959a91b551f1c73c Feb 27 07:50:28 crc kubenswrapper[4725]: I0227 07:50:28.558023 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" event={"ID":"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca","Type":"ContainerStarted","Data":"523efdee1336e5addcc274781d550f560c22d21890669c1401c2c103a9269497"} Feb 27 07:50:28 crc kubenswrapper[4725]: I0227 07:50:28.558708 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" event={"ID":"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca","Type":"ContainerStarted","Data":"f86e391c0372d647d04ed54ffe7aaedbbd672234cf31cc4b959a91b551f1c73c"} Feb 
27 07:50:28 crc kubenswrapper[4725]: I0227 07:50:28.581682 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" podStartSLOduration=1.581660117 podStartE2EDuration="1.581660117s" podCreationTimestamp="2026-02-27 07:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 07:50:28.572039085 +0000 UTC m=+6007.034659694" watchObservedRunningTime="2026-02-27 07:50:28.581660117 +0000 UTC m=+6007.044280696" Feb 27 07:50:30 crc kubenswrapper[4725]: I0227 07:50:30.251977 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:50:30 crc kubenswrapper[4725]: E0227 07:50:30.252753 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:50:42 crc kubenswrapper[4725]: I0227 07:50:42.263677 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:50:42 crc kubenswrapper[4725]: E0227 07:50:42.264516 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:50:57 crc kubenswrapper[4725]: I0227 07:50:57.251637 4725 scope.go:117] 
"RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:50:57 crc kubenswrapper[4725]: E0227 07:50:57.252496 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:51:07 crc kubenswrapper[4725]: I0227 07:51:07.965118 4725 generic.go:334] "Generic (PLEG): container finished" podID="60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" containerID="523efdee1336e5addcc274781d550f560c22d21890669c1401c2c103a9269497" exitCode=0 Feb 27 07:51:07 crc kubenswrapper[4725]: I0227 07:51:07.965224 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" event={"ID":"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca","Type":"ContainerDied","Data":"523efdee1336e5addcc274781d550f560c22d21890669c1401c2c103a9269497"} Feb 27 07:51:08 crc kubenswrapper[4725]: I0227 07:51:08.252551 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:51:08 crc kubenswrapper[4725]: I0227 07:51:08.988908 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"0b6a77d2e5351870ca8784a3bea682f8887057ff4350604fd14f640d1e0ac2f3"} Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.119943 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.160638 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-fl22n"] Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.170866 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-fl22n"] Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.262130 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-host\") pod \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.262250 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-host" (OuterVolumeSpecName: "host") pod "60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" (UID: "60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.263168 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shrxk\" (UniqueName: \"kubernetes.io/projected/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-kube-api-access-shrxk\") pod \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\" (UID: \"60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca\") " Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.263706 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-host\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.278586 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-kube-api-access-shrxk" (OuterVolumeSpecName: "kube-api-access-shrxk") pod "60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" (UID: "60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca"). InnerVolumeSpecName "kube-api-access-shrxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.365172 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shrxk\" (UniqueName: \"kubernetes.io/projected/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca-kube-api-access-shrxk\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.999190 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86e391c0372d647d04ed54ffe7aaedbbd672234cf31cc4b959a91b551f1c73c" Feb 27 07:51:09 crc kubenswrapper[4725]: I0227 07:51:09.999231 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-fl22n" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.280260 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" path="/var/lib/kubelet/pods/60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca/volumes" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.448661 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-62mj6"] Feb 27 07:51:10 crc kubenswrapper[4725]: E0227 07:51:10.449173 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" containerName="container-00" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.449195 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" containerName="container-00" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.449457 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a7b1cd-8ac5-4061-92ef-ad5e5a8b91ca" containerName="container-00" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.450359 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.452535 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bgvhz"/"default-dockercfg-5zmmv" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.588682 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bcdaca5-e78c-45a4-88cf-86fc87b57979-host\") pod \"crc-debug-62mj6\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.589007 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkffp\" (UniqueName: \"kubernetes.io/projected/8bcdaca5-e78c-45a4-88cf-86fc87b57979-kube-api-access-vkffp\") pod \"crc-debug-62mj6\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.691006 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bcdaca5-e78c-45a4-88cf-86fc87b57979-host\") pod \"crc-debug-62mj6\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.691150 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bcdaca5-e78c-45a4-88cf-86fc87b57979-host\") pod \"crc-debug-62mj6\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.691170 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkffp\" (UniqueName: 
\"kubernetes.io/projected/8bcdaca5-e78c-45a4-88cf-86fc87b57979-kube-api-access-vkffp\") pod \"crc-debug-62mj6\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.716916 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkffp\" (UniqueName: \"kubernetes.io/projected/8bcdaca5-e78c-45a4-88cf-86fc87b57979-kube-api-access-vkffp\") pod \"crc-debug-62mj6\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: I0227 07:51:10.765075 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:10 crc kubenswrapper[4725]: W0227 07:51:10.798575 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bcdaca5_e78c_45a4_88cf_86fc87b57979.slice/crio-48fcb795f3ed4c38ba75fc0669c6d50c38ce8ec134bbca62dcc3da58bafca407 WatchSource:0}: Error finding container 48fcb795f3ed4c38ba75fc0669c6d50c38ce8ec134bbca62dcc3da58bafca407: Status 404 returned error can't find the container with id 48fcb795f3ed4c38ba75fc0669c6d50c38ce8ec134bbca62dcc3da58bafca407 Feb 27 07:51:11 crc kubenswrapper[4725]: I0227 07:51:11.008777 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" event={"ID":"8bcdaca5-e78c-45a4-88cf-86fc87b57979","Type":"ContainerStarted","Data":"48fcb795f3ed4c38ba75fc0669c6d50c38ce8ec134bbca62dcc3da58bafca407"} Feb 27 07:51:12 crc kubenswrapper[4725]: I0227 07:51:12.019110 4725 generic.go:334] "Generic (PLEG): container finished" podID="8bcdaca5-e78c-45a4-88cf-86fc87b57979" containerID="c427277ee229ee8bf4b08ed4fc1fee3b3314b1536c3138922730181bfc302618" exitCode=0 Feb 27 07:51:12 crc kubenswrapper[4725]: I0227 07:51:12.019209 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" event={"ID":"8bcdaca5-e78c-45a4-88cf-86fc87b57979","Type":"ContainerDied","Data":"c427277ee229ee8bf4b08ed4fc1fee3b3314b1536c3138922730181bfc302618"} Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.134187 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.232275 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bcdaca5-e78c-45a4-88cf-86fc87b57979-host\") pod \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.232403 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bcdaca5-e78c-45a4-88cf-86fc87b57979-host" (OuterVolumeSpecName: "host") pod "8bcdaca5-e78c-45a4-88cf-86fc87b57979" (UID: "8bcdaca5-e78c-45a4-88cf-86fc87b57979"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.232980 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkffp\" (UniqueName: \"kubernetes.io/projected/8bcdaca5-e78c-45a4-88cf-86fc87b57979-kube-api-access-vkffp\") pod \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\" (UID: \"8bcdaca5-e78c-45a4-88cf-86fc87b57979\") " Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.233761 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bcdaca5-e78c-45a4-88cf-86fc87b57979-host\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.248541 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcdaca5-e78c-45a4-88cf-86fc87b57979-kube-api-access-vkffp" (OuterVolumeSpecName: "kube-api-access-vkffp") pod "8bcdaca5-e78c-45a4-88cf-86fc87b57979" (UID: "8bcdaca5-e78c-45a4-88cf-86fc87b57979"). InnerVolumeSpecName "kube-api-access-vkffp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:51:13 crc kubenswrapper[4725]: I0227 07:51:13.335910 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkffp\" (UniqueName: \"kubernetes.io/projected/8bcdaca5-e78c-45a4-88cf-86fc87b57979-kube-api-access-vkffp\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:14 crc kubenswrapper[4725]: I0227 07:51:14.038539 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" event={"ID":"8bcdaca5-e78c-45a4-88cf-86fc87b57979","Type":"ContainerDied","Data":"48fcb795f3ed4c38ba75fc0669c6d50c38ce8ec134bbca62dcc3da58bafca407"} Feb 27 07:51:14 crc kubenswrapper[4725]: I0227 07:51:14.038882 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48fcb795f3ed4c38ba75fc0669c6d50c38ce8ec134bbca62dcc3da58bafca407" Feb 27 07:51:14 crc kubenswrapper[4725]: I0227 07:51:14.038592 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-62mj6" Feb 27 07:51:14 crc kubenswrapper[4725]: I0227 07:51:14.153758 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-62mj6"] Feb 27 07:51:14 crc kubenswrapper[4725]: I0227 07:51:14.161266 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-62mj6"] Feb 27 07:51:14 crc kubenswrapper[4725]: I0227 07:51:14.273652 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcdaca5-e78c-45a4-88cf-86fc87b57979" path="/var/lib/kubelet/pods/8bcdaca5-e78c-45a4-88cf-86fc87b57979/volumes" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.311340 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-7jb5f"] Feb 27 07:51:15 crc kubenswrapper[4725]: E0227 07:51:15.311740 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcdaca5-e78c-45a4-88cf-86fc87b57979" 
containerName="container-00" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.311752 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcdaca5-e78c-45a4-88cf-86fc87b57979" containerName="container-00" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.311976 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcdaca5-e78c-45a4-88cf-86fc87b57979" containerName="container-00" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.312645 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.315348 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bgvhz"/"default-dockercfg-5zmmv" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.474854 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92df8\" (UniqueName: \"kubernetes.io/projected/a6db1b28-5b00-412f-ab07-0bb6b487544f-kube-api-access-92df8\") pod \"crc-debug-7jb5f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.475208 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6db1b28-5b00-412f-ab07-0bb6b487544f-host\") pod \"crc-debug-7jb5f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.576841 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6db1b28-5b00-412f-ab07-0bb6b487544f-host\") pod \"crc-debug-7jb5f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 
07:51:15.576955 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6db1b28-5b00-412f-ab07-0bb6b487544f-host\") pod \"crc-debug-7jb5f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.577107 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92df8\" (UniqueName: \"kubernetes.io/projected/a6db1b28-5b00-412f-ab07-0bb6b487544f-kube-api-access-92df8\") pod \"crc-debug-7jb5f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.610575 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92df8\" (UniqueName: \"kubernetes.io/projected/a6db1b28-5b00-412f-ab07-0bb6b487544f-kube-api-access-92df8\") pod \"crc-debug-7jb5f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: I0227 07:51:15.632092 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:15 crc kubenswrapper[4725]: W0227 07:51:15.656122 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6db1b28_5b00_412f_ab07_0bb6b487544f.slice/crio-239e3c56f1395a6dafa0623a132c110f67be5924e2dd20e5eabd1e115fcac96e WatchSource:0}: Error finding container 239e3c56f1395a6dafa0623a132c110f67be5924e2dd20e5eabd1e115fcac96e: Status 404 returned error can't find the container with id 239e3c56f1395a6dafa0623a132c110f67be5924e2dd20e5eabd1e115fcac96e Feb 27 07:51:16 crc kubenswrapper[4725]: I0227 07:51:16.056670 4725 generic.go:334] "Generic (PLEG): container finished" podID="a6db1b28-5b00-412f-ab07-0bb6b487544f" containerID="a899b2c3f56c12325999051dabd78a0fbb855ce28f16d610ecae057d57f7e70a" exitCode=0 Feb 27 07:51:16 crc kubenswrapper[4725]: I0227 07:51:16.057000 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" event={"ID":"a6db1b28-5b00-412f-ab07-0bb6b487544f","Type":"ContainerDied","Data":"a899b2c3f56c12325999051dabd78a0fbb855ce28f16d610ecae057d57f7e70a"} Feb 27 07:51:16 crc kubenswrapper[4725]: I0227 07:51:16.057033 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" event={"ID":"a6db1b28-5b00-412f-ab07-0bb6b487544f","Type":"ContainerStarted","Data":"239e3c56f1395a6dafa0623a132c110f67be5924e2dd20e5eabd1e115fcac96e"} Feb 27 07:51:16 crc kubenswrapper[4725]: I0227 07:51:16.091819 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-7jb5f"] Feb 27 07:51:16 crc kubenswrapper[4725]: I0227 07:51:16.101315 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgvhz/crc-debug-7jb5f"] Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.164253 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.332696 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92df8\" (UniqueName: \"kubernetes.io/projected/a6db1b28-5b00-412f-ab07-0bb6b487544f-kube-api-access-92df8\") pod \"a6db1b28-5b00-412f-ab07-0bb6b487544f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.333035 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6db1b28-5b00-412f-ab07-0bb6b487544f-host\") pod \"a6db1b28-5b00-412f-ab07-0bb6b487544f\" (UID: \"a6db1b28-5b00-412f-ab07-0bb6b487544f\") " Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.333153 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6db1b28-5b00-412f-ab07-0bb6b487544f-host" (OuterVolumeSpecName: "host") pod "a6db1b28-5b00-412f-ab07-0bb6b487544f" (UID: "a6db1b28-5b00-412f-ab07-0bb6b487544f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.333552 4725 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6db1b28-5b00-412f-ab07-0bb6b487544f-host\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.344131 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6db1b28-5b00-412f-ab07-0bb6b487544f-kube-api-access-92df8" (OuterVolumeSpecName: "kube-api-access-92df8") pod "a6db1b28-5b00-412f-ab07-0bb6b487544f" (UID: "a6db1b28-5b00-412f-ab07-0bb6b487544f"). InnerVolumeSpecName "kube-api-access-92df8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.435752 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92df8\" (UniqueName: \"kubernetes.io/projected/a6db1b28-5b00-412f-ab07-0bb6b487544f-kube-api-access-92df8\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.469205 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qm87b"] Feb 27 07:51:17 crc kubenswrapper[4725]: E0227 07:51:17.469723 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6db1b28-5b00-412f-ab07-0bb6b487544f" containerName="container-00" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.469744 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6db1b28-5b00-412f-ab07-0bb6b487544f" containerName="container-00" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.469995 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6db1b28-5b00-412f-ab07-0bb6b487544f" containerName="container-00" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.476020 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.481015 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm87b"] Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.640221 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5144bc55-8158-469c-b43b-92a491875b63-utilities\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.640299 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5144bc55-8158-469c-b43b-92a491875b63-catalog-content\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.640397 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/5144bc55-8158-469c-b43b-92a491875b63-kube-api-access-hlfvv\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.742819 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5144bc55-8158-469c-b43b-92a491875b63-utilities\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.743169 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5144bc55-8158-469c-b43b-92a491875b63-catalog-content\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.743260 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/5144bc55-8158-469c-b43b-92a491875b63-kube-api-access-hlfvv\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.743633 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5144bc55-8158-469c-b43b-92a491875b63-catalog-content\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.743651 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5144bc55-8158-469c-b43b-92a491875b63-utilities\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.762404 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfvv\" (UniqueName: \"kubernetes.io/projected/5144bc55-8158-469c-b43b-92a491875b63-kube-api-access-hlfvv\") pod \"community-operators-qm87b\" (UID: \"5144bc55-8158-469c-b43b-92a491875b63\") " pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:17 crc kubenswrapper[4725]: I0227 07:51:17.794099 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:18 crc kubenswrapper[4725]: I0227 07:51:18.079032 4725 scope.go:117] "RemoveContainer" containerID="a899b2c3f56c12325999051dabd78a0fbb855ce28f16d610ecae057d57f7e70a" Feb 27 07:51:18 crc kubenswrapper[4725]: I0227 07:51:18.079488 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/crc-debug-7jb5f" Feb 27 07:51:18 crc kubenswrapper[4725]: I0227 07:51:18.262626 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6db1b28-5b00-412f-ab07-0bb6b487544f" path="/var/lib/kubelet/pods/a6db1b28-5b00-412f-ab07-0bb6b487544f/volumes" Feb 27 07:51:18 crc kubenswrapper[4725]: I0227 07:51:18.430915 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm87b"] Feb 27 07:51:19 crc kubenswrapper[4725]: I0227 07:51:19.088374 4725 generic.go:334] "Generic (PLEG): container finished" podID="5144bc55-8158-469c-b43b-92a491875b63" containerID="84d88cf21855911a21676c335346d8fab048573e9fcee6986ef7c88c3542bcce" exitCode=0 Feb 27 07:51:19 crc kubenswrapper[4725]: I0227 07:51:19.088475 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm87b" event={"ID":"5144bc55-8158-469c-b43b-92a491875b63","Type":"ContainerDied","Data":"84d88cf21855911a21676c335346d8fab048573e9fcee6986ef7c88c3542bcce"} Feb 27 07:51:19 crc kubenswrapper[4725]: I0227 07:51:19.088845 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm87b" event={"ID":"5144bc55-8158-469c-b43b-92a491875b63","Type":"ContainerStarted","Data":"22bb8d88fa0810be2a47a4cfad568f699701261277b953185901a64c155f83b0"} Feb 27 07:51:25 crc kubenswrapper[4725]: I0227 07:51:25.150961 4725 generic.go:334] "Generic (PLEG): container finished" podID="5144bc55-8158-469c-b43b-92a491875b63" 
containerID="b8796f653b1576f6cf234f56b34d575a481e01d9953d04d4d43b70b544f167bc" exitCode=0 Feb 27 07:51:25 crc kubenswrapper[4725]: I0227 07:51:25.151152 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm87b" event={"ID":"5144bc55-8158-469c-b43b-92a491875b63","Type":"ContainerDied","Data":"b8796f653b1576f6cf234f56b34d575a481e01d9953d04d4d43b70b544f167bc"} Feb 27 07:51:26 crc kubenswrapper[4725]: I0227 07:51:26.162373 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qm87b" event={"ID":"5144bc55-8158-469c-b43b-92a491875b63","Type":"ContainerStarted","Data":"33abc4dea1efe678b2cfe4fa7b8476beb56df5da583001025748cec972c04c39"} Feb 27 07:51:26 crc kubenswrapper[4725]: I0227 07:51:26.181762 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qm87b" podStartSLOduration=2.741775128 podStartE2EDuration="9.181738123s" podCreationTimestamp="2026-02-27 07:51:17 +0000 UTC" firstStartedPulling="2026-02-27 07:51:19.090370951 +0000 UTC m=+6057.552991520" lastFinishedPulling="2026-02-27 07:51:25.530333946 +0000 UTC m=+6063.992954515" observedRunningTime="2026-02-27 07:51:26.178932654 +0000 UTC m=+6064.641553243" watchObservedRunningTime="2026-02-27 07:51:26.181738123 +0000 UTC m=+6064.644358692" Feb 27 07:51:27 crc kubenswrapper[4725]: I0227 07:51:27.794471 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:27 crc kubenswrapper[4725]: I0227 07:51:27.795880 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:28 crc kubenswrapper[4725]: I0227 07:51:28.880523 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qm87b" podUID="5144bc55-8158-469c-b43b-92a491875b63" containerName="registry-server" 
probeResult="failure" output=< Feb 27 07:51:28 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:51:28 crc kubenswrapper[4725]: > Feb 27 07:51:37 crc kubenswrapper[4725]: I0227 07:51:37.845053 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:37 crc kubenswrapper[4725]: I0227 07:51:37.896519 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qm87b" Feb 27 07:51:37 crc kubenswrapper[4725]: I0227 07:51:37.961993 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qm87b"] Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.082709 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.083532 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bps24" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="registry-server" containerID="cri-o://bf5742f4fdc6915f01102e306e03fe341d4f4d29ea562f3c6ffb032aca94d50a" gracePeriod=2 Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.319341 4725 generic.go:334] "Generic (PLEG): container finished" podID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerID="bf5742f4fdc6915f01102e306e03fe341d4f4d29ea562f3c6ffb032aca94d50a" exitCode=0 Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.319790 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bps24" event={"ID":"359b62ed-682f-43ae-9a58-1953f516c2d0","Type":"ContainerDied","Data":"bf5742f4fdc6915f01102e306e03fe341d4f4d29ea562f3c6ffb032aca94d50a"} Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.570848 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bps24" Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.597649 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-catalog-content\") pod \"359b62ed-682f-43ae-9a58-1953f516c2d0\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.597742 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsgcv\" (UniqueName: \"kubernetes.io/projected/359b62ed-682f-43ae-9a58-1953f516c2d0-kube-api-access-wsgcv\") pod \"359b62ed-682f-43ae-9a58-1953f516c2d0\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.597774 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-utilities\") pod \"359b62ed-682f-43ae-9a58-1953f516c2d0\" (UID: \"359b62ed-682f-43ae-9a58-1953f516c2d0\") " Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.599827 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-utilities" (OuterVolumeSpecName: "utilities") pod "359b62ed-682f-43ae-9a58-1953f516c2d0" (UID: "359b62ed-682f-43ae-9a58-1953f516c2d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.609475 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359b62ed-682f-43ae-9a58-1953f516c2d0-kube-api-access-wsgcv" (OuterVolumeSpecName: "kube-api-access-wsgcv") pod "359b62ed-682f-43ae-9a58-1953f516c2d0" (UID: "359b62ed-682f-43ae-9a58-1953f516c2d0"). InnerVolumeSpecName "kube-api-access-wsgcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.700132 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsgcv\" (UniqueName: \"kubernetes.io/projected/359b62ed-682f-43ae-9a58-1953f516c2d0-kube-api-access-wsgcv\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.700363 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.727376 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "359b62ed-682f-43ae-9a58-1953f516c2d0" (UID: "359b62ed-682f-43ae-9a58-1953f516c2d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:51:38 crc kubenswrapper[4725]: I0227 07:51:38.802470 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359b62ed-682f-43ae-9a58-1953f516c2d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.331131 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bps24" Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.331535 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bps24" event={"ID":"359b62ed-682f-43ae-9a58-1953f516c2d0","Type":"ContainerDied","Data":"3e5fd64b72e0c5dff22a7ba159c84a6328207604ae3d2cce39b63b8cc470d26f"} Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.332417 4725 scope.go:117] "RemoveContainer" containerID="bf5742f4fdc6915f01102e306e03fe341d4f4d29ea562f3c6ffb032aca94d50a" Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.376470 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.377205 4725 scope.go:117] "RemoveContainer" containerID="d87f2bb04c4ad4f08bdcd2f9539d1986a2e0d60caeeab2c7c7b1eeccae3e7681" Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.385609 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bps24"] Feb 27 07:51:39 crc kubenswrapper[4725]: I0227 07:51:39.450918 4725 scope.go:117] "RemoveContainer" containerID="40c7d5b95381a289a13281b21d8a33761775065cccb524091f6aebfb652c531c" Feb 27 07:51:40 crc kubenswrapper[4725]: I0227 07:51:40.264194 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" path="/var/lib/kubelet/pods/359b62ed-682f-43ae-9a58-1953f516c2d0/volumes" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.595238 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v97bf"] Feb 27 07:51:53 crc kubenswrapper[4725]: E0227 07:51:53.596854 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="registry-server" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.596871 4725 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="registry-server" Feb 27 07:51:53 crc kubenswrapper[4725]: E0227 07:51:53.596895 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="extract-utilities" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.596904 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="extract-utilities" Feb 27 07:51:53 crc kubenswrapper[4725]: E0227 07:51:53.596918 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="extract-content" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.596926 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="extract-content" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.597184 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="359b62ed-682f-43ae-9a58-1953f516c2d0" containerName="registry-server" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.598924 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.631257 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v97bf"] Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.714905 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-catalog-content\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.714974 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9l99\" (UniqueName: \"kubernetes.io/projected/488e19fe-5ef0-4faf-ab88-87eddee8d26b-kube-api-access-f9l99\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.715004 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-utilities\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.818655 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-catalog-content\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.818746 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f9l99\" (UniqueName: \"kubernetes.io/projected/488e19fe-5ef0-4faf-ab88-87eddee8d26b-kube-api-access-f9l99\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.818778 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-utilities\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.819323 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-catalog-content\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.819464 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-utilities\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.848552 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9l99\" (UniqueName: \"kubernetes.io/projected/488e19fe-5ef0-4faf-ab88-87eddee8d26b-kube-api-access-f9l99\") pod \"certified-operators-v97bf\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:53 crc kubenswrapper[4725]: I0227 07:51:53.929658 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:51:54 crc kubenswrapper[4725]: I0227 07:51:54.495774 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v97bf"] Feb 27 07:51:55 crc kubenswrapper[4725]: I0227 07:51:55.494724 4725 generic.go:334] "Generic (PLEG): container finished" podID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerID="b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a" exitCode=0 Feb 27 07:51:55 crc kubenswrapper[4725]: I0227 07:51:55.495034 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerDied","Data":"b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a"} Feb 27 07:51:55 crc kubenswrapper[4725]: I0227 07:51:55.495070 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerStarted","Data":"d92610d024f5eed2d1de85a72fad7342d86c1c9caf35ee4bfbf45bea817769a6"} Feb 27 07:51:56 crc kubenswrapper[4725]: I0227 07:51:56.511615 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerStarted","Data":"372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff"} Feb 27 07:51:57 crc kubenswrapper[4725]: I0227 07:51:57.537791 4725 generic.go:334] "Generic (PLEG): container finished" podID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerID="372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff" exitCode=0 Feb 27 07:51:57 crc kubenswrapper[4725]: I0227 07:51:57.538118 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" 
event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerDied","Data":"372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff"} Feb 27 07:51:58 crc kubenswrapper[4725]: I0227 07:51:58.565489 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerStarted","Data":"1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd"} Feb 27 07:51:58 crc kubenswrapper[4725]: I0227 07:51:58.602986 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v97bf" podStartSLOduration=2.949090567 podStartE2EDuration="5.602966915s" podCreationTimestamp="2026-02-27 07:51:53 +0000 UTC" firstStartedPulling="2026-02-27 07:51:55.497754059 +0000 UTC m=+6093.960374658" lastFinishedPulling="2026-02-27 07:51:58.151630397 +0000 UTC m=+6096.614251006" observedRunningTime="2026-02-27 07:51:58.592424706 +0000 UTC m=+6097.055045285" watchObservedRunningTime="2026-02-27 07:51:58.602966915 +0000 UTC m=+6097.065587494" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.167192 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536312-nx4j7"] Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.169772 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.173519 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.173553 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.175393 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.191277 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536312-nx4j7"] Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.377462 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tjqm\" (UniqueName: \"kubernetes.io/projected/472f98a7-158b-4367-a5ab-0a5c7362482c-kube-api-access-8tjqm\") pod \"auto-csr-approver-29536312-nx4j7\" (UID: \"472f98a7-158b-4367-a5ab-0a5c7362482c\") " pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.480802 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tjqm\" (UniqueName: \"kubernetes.io/projected/472f98a7-158b-4367-a5ab-0a5c7362482c-kube-api-access-8tjqm\") pod \"auto-csr-approver-29536312-nx4j7\" (UID: \"472f98a7-158b-4367-a5ab-0a5c7362482c\") " pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.499090 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tjqm\" (UniqueName: \"kubernetes.io/projected/472f98a7-158b-4367-a5ab-0a5c7362482c-kube-api-access-8tjqm\") pod \"auto-csr-approver-29536312-nx4j7\" (UID: \"472f98a7-158b-4367-a5ab-0a5c7362482c\") " 
pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:00 crc kubenswrapper[4725]: I0227 07:52:00.797738 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:01 crc kubenswrapper[4725]: I0227 07:52:01.278086 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536312-nx4j7"] Feb 27 07:52:01 crc kubenswrapper[4725]: W0227 07:52:01.279814 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod472f98a7_158b_4367_a5ab_0a5c7362482c.slice/crio-45cbc7a3b406298b5d733c68d584b95109eb74ca479fc621cfe601bdcc98fdf4 WatchSource:0}: Error finding container 45cbc7a3b406298b5d733c68d584b95109eb74ca479fc621cfe601bdcc98fdf4: Status 404 returned error can't find the container with id 45cbc7a3b406298b5d733c68d584b95109eb74ca479fc621cfe601bdcc98fdf4 Feb 27 07:52:01 crc kubenswrapper[4725]: I0227 07:52:01.597440 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" event={"ID":"472f98a7-158b-4367-a5ab-0a5c7362482c","Type":"ContainerStarted","Data":"45cbc7a3b406298b5d733c68d584b95109eb74ca479fc621cfe601bdcc98fdf4"} Feb 27 07:52:02 crc kubenswrapper[4725]: I0227 07:52:02.609353 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" event={"ID":"472f98a7-158b-4367-a5ab-0a5c7362482c","Type":"ContainerStarted","Data":"665b45f72b53fbefc52c7f2fb9874bc112f2ac31a05ad0a2edee7500e9332900"} Feb 27 07:52:02 crc kubenswrapper[4725]: I0227 07:52:02.632222 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" podStartSLOduration=1.819963408 podStartE2EDuration="2.632203873s" podCreationTimestamp="2026-02-27 07:52:00 +0000 UTC" firstStartedPulling="2026-02-27 07:52:01.284371542 +0000 UTC 
m=+6099.746992111" lastFinishedPulling="2026-02-27 07:52:02.096612007 +0000 UTC m=+6100.559232576" observedRunningTime="2026-02-27 07:52:02.620854861 +0000 UTC m=+6101.083475450" watchObservedRunningTime="2026-02-27 07:52:02.632203873 +0000 UTC m=+6101.094824442" Feb 27 07:52:03 crc kubenswrapper[4725]: I0227 07:52:03.619084 4725 generic.go:334] "Generic (PLEG): container finished" podID="472f98a7-158b-4367-a5ab-0a5c7362482c" containerID="665b45f72b53fbefc52c7f2fb9874bc112f2ac31a05ad0a2edee7500e9332900" exitCode=0 Feb 27 07:52:03 crc kubenswrapper[4725]: I0227 07:52:03.619219 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" event={"ID":"472f98a7-158b-4367-a5ab-0a5c7362482c","Type":"ContainerDied","Data":"665b45f72b53fbefc52c7f2fb9874bc112f2ac31a05ad0a2edee7500e9332900"} Feb 27 07:52:03 crc kubenswrapper[4725]: I0227 07:52:03.930744 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:52:03 crc kubenswrapper[4725]: I0227 07:52:03.931005 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:52:03 crc kubenswrapper[4725]: I0227 07:52:03.992086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:52:04 crc kubenswrapper[4725]: I0227 07:52:04.671055 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:52:04 crc kubenswrapper[4725]: I0227 07:52:04.720009 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v97bf"] Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.063098 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.083263 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tjqm\" (UniqueName: \"kubernetes.io/projected/472f98a7-158b-4367-a5ab-0a5c7362482c-kube-api-access-8tjqm\") pod \"472f98a7-158b-4367-a5ab-0a5c7362482c\" (UID: \"472f98a7-158b-4367-a5ab-0a5c7362482c\") " Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.091421 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472f98a7-158b-4367-a5ab-0a5c7362482c-kube-api-access-8tjqm" (OuterVolumeSpecName: "kube-api-access-8tjqm") pod "472f98a7-158b-4367-a5ab-0a5c7362482c" (UID: "472f98a7-158b-4367-a5ab-0a5c7362482c"). InnerVolumeSpecName "kube-api-access-8tjqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.186664 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tjqm\" (UniqueName: \"kubernetes.io/projected/472f98a7-158b-4367-a5ab-0a5c7362482c-kube-api-access-8tjqm\") on node \"crc\" DevicePath \"\"" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.337991 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536306-wzmbc"] Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.349035 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536306-wzmbc"] Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.501886 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7959dbc8c4-8fc74_db039076-6d42-4d4e-b0d2-479ae5a91408/barbican-api/0.log" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.638212 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" 
event={"ID":"472f98a7-158b-4367-a5ab-0a5c7362482c","Type":"ContainerDied","Data":"45cbc7a3b406298b5d733c68d584b95109eb74ca479fc621cfe601bdcc98fdf4"} Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.638260 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45cbc7a3b406298b5d733c68d584b95109eb74ca479fc621cfe601bdcc98fdf4" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.638260 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536312-nx4j7" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.681024 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7959dbc8c4-8fc74_db039076-6d42-4d4e-b0d2-479ae5a91408/barbican-api-log/0.log" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.740307 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64b777b644-7s9mh_d495df58-14fc-4eb9-a8f1-104b6ca6ce22/barbican-keystone-listener/0.log" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.863185 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64b777b644-7s9mh_d495df58-14fc-4eb9-a8f1-104b6ca6ce22/barbican-keystone-listener-log/0.log" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.896651 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-545fff646c-dqt5j_e4edb4e2-0feb-4075-a823-c02d954872d3/barbican-worker/0.log" Feb 27 07:52:05 crc kubenswrapper[4725]: I0227 07:52:05.968102 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-545fff646c-dqt5j_e4edb4e2-0feb-4075-a823-c02d954872d3/barbican-worker-log/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.112926 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ngb74_6bd144c7-5dac-46d4-8cab-b3a31a352974/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.262924 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfff0880-8ec5-4d86-99e8-b0bac5b0b29c" path="/var/lib/kubelet/pods/bfff0880-8ec5-4d86-99e8-b0bac5b0b29c/volumes" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.284320 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/ceilometer-central-agent/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.339847 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/proxy-httpd/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.342691 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/ceilometer-notification-agent/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.394942 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c9649343-ccb7-4cbf-a1dd-4d4c0cc3dedd/sg-core/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.560738 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_819ee261-b129-4874-8f16-5f505d7b3c01/cinder-api-log/0.log" Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.647124 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v97bf" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="registry-server" containerID="cri-o://1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd" gracePeriod=2 Feb 27 07:52:06 crc kubenswrapper[4725]: I0227 07:52:06.937460 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_1c1a66bf-70db-4738-ae7d-4fd930ec4f4d/probe/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.185645 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.251280 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9l99\" (UniqueName: \"kubernetes.io/projected/488e19fe-5ef0-4faf-ab88-87eddee8d26b-kube-api-access-f9l99\") pod \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.251450 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-utilities\") pod \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.251586 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-catalog-content\") pod \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\" (UID: \"488e19fe-5ef0-4faf-ab88-87eddee8d26b\") " Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.254437 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-utilities" (OuterVolumeSpecName: "utilities") pod "488e19fe-5ef0-4faf-ab88-87eddee8d26b" (UID: "488e19fe-5ef0-4faf-ab88-87eddee8d26b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.263555 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488e19fe-5ef0-4faf-ab88-87eddee8d26b-kube-api-access-f9l99" (OuterVolumeSpecName: "kube-api-access-f9l99") pod "488e19fe-5ef0-4faf-ab88-87eddee8d26b" (UID: "488e19fe-5ef0-4faf-ab88-87eddee8d26b"). InnerVolumeSpecName "kube-api-access-f9l99". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.320250 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "488e19fe-5ef0-4faf-ab88-87eddee8d26b" (UID: "488e19fe-5ef0-4faf-ab88-87eddee8d26b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.336116 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b71d8cfd-c55f-43fd-b7b7-90c063488103/probe/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.343462 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b71d8cfd-c55f-43fd-b7b7-90c063488103/cinder-scheduler/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.353993 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9l99\" (UniqueName: \"kubernetes.io/projected/488e19fe-5ef0-4faf-ab88-87eddee8d26b-kube-api-access-f9l99\") on node \"crc\" DevicePath \"\"" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.354206 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.354266 4725 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488e19fe-5ef0-4faf-ab88-87eddee8d26b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.392624 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_1c1a66bf-70db-4738-ae7d-4fd930ec4f4d/cinder-backup/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.393475 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_819ee261-b129-4874-8f16-5f505d7b3c01/cinder-api/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.614445 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_ba6ad5a5-a980-46a3-8891-5448144c7885/probe/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.658954 4725 generic.go:334] "Generic (PLEG): container finished" podID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerID="1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd" exitCode=0 Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.659013 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v97bf" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.659033 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerDied","Data":"1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd"} Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.659085 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v97bf" event={"ID":"488e19fe-5ef0-4faf-ab88-87eddee8d26b","Type":"ContainerDied","Data":"d92610d024f5eed2d1de85a72fad7342d86c1c9caf35ee4bfbf45bea817769a6"} Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.659105 4725 scope.go:117] "RemoveContainer" containerID="1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.703338 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v97bf"] Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.710718 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v97bf"] Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.712648 4725 scope.go:117] "RemoveContainer" containerID="372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.735960 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_ba6ad5a5-a980-46a3-8891-5448144c7885/cinder-volume/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.759473 4725 scope.go:117] "RemoveContainer" containerID="b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.799413 4725 scope.go:117] "RemoveContainer" containerID="1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd" 
Feb 27 07:52:07 crc kubenswrapper[4725]: E0227 07:52:07.805480 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd\": container with ID starting with 1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd not found: ID does not exist" containerID="1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.805532 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd"} err="failed to get container status \"1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd\": rpc error: code = NotFound desc = could not find container \"1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd\": container with ID starting with 1825060b4a0a24ce3008c05b0f9f75a0f63f0d5c226f241a97302ad9b62adecd not found: ID does not exist" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.805566 4725 scope.go:117] "RemoveContainer" containerID="372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff" Feb 27 07:52:07 crc kubenswrapper[4725]: E0227 07:52:07.806210 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff\": container with ID starting with 372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff not found: ID does not exist" containerID="372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.806256 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff"} err="failed to get container status 
\"372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff\": rpc error: code = NotFound desc = could not find container \"372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff\": container with ID starting with 372aef57f65cfaaf8f86536ed71a7f529177197068c29aa4f1db82bfcdea82ff not found: ID does not exist" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.806309 4725 scope.go:117] "RemoveContainer" containerID="b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a" Feb 27 07:52:07 crc kubenswrapper[4725]: E0227 07:52:07.806619 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a\": container with ID starting with b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a not found: ID does not exist" containerID="b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.806649 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a"} err="failed to get container status \"b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a\": rpc error: code = NotFound desc = could not find container \"b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a\": container with ID starting with b0e727e1c46b2fc1f519fe1c42e3103e059b9a66d7d34a1edbda03b089a8fe4a not found: ID does not exist" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.956227 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_be6347d4-c8d8-416d-9229-9671f6a027d4/cinder-volume/0.log" Feb 27 07:52:07 crc kubenswrapper[4725]: I0227 07:52:07.968054 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_be6347d4-c8d8-416d-9229-9671f6a027d4/probe/0.log" Feb 27 
07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.010122 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pj85v_311ba5d5-8172-405b-aead-458a7149e826/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.167273 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8mzjn_8b00cf98-bb69-4c5e-8f34-e862f1acf329/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.221007 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-584644fbc5-9wt8c_77a07f13-4e0b-4d51-9e35-2787348e7a63/init/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.262116 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" path="/var/lib/kubelet/pods/488e19fe-5ef0-4faf-ab88-87eddee8d26b/volumes" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.390662 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-584644fbc5-9wt8c_77a07f13-4e0b-4d51-9e35-2787348e7a63/init/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.463443 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rsgs9_1fab6c47-9849-428c-96a3-96c4cac71f69/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.619642 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-584644fbc5-9wt8c_77a07f13-4e0b-4d51-9e35-2787348e7a63/dnsmasq-dns/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.731044 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4b3b10b4-8a3a-492c-97ce-9ae74040d8ae/glance-httpd/0.log" Feb 27 
07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.739125 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4b3b10b4-8a3a-492c-97ce-9ae74040d8ae/glance-log/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.950366 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41a3d13f-cb8c-42cc-aa8e-12d09fe458f1/glance-httpd/0.log" Feb 27 07:52:08 crc kubenswrapper[4725]: I0227 07:52:08.952219 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_41a3d13f-cb8c-42cc-aa8e-12d09fe458f1/glance-log/0.log" Feb 27 07:52:09 crc kubenswrapper[4725]: I0227 07:52:09.182323 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f478fcd58-cfjzp_372d4de4-ea8f-4393-af8b-1139e593ac16/horizon/0.log" Feb 27 07:52:09 crc kubenswrapper[4725]: I0227 07:52:09.272706 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-dcf9n_8732fb9d-c8a7-4cb3-acca-83301a2c03dc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:09 crc kubenswrapper[4725]: I0227 07:52:09.507867 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-64s2r_bf027694-e689-4cb8-aaf6-3e848ec2de4b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:09 crc kubenswrapper[4725]: I0227 07:52:09.741469 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29536261-jgbdj_c53bb79f-c970-4c9e-9a11-c8961e8041ce/keystone-cron/0.log" Feb 27 07:52:09 crc kubenswrapper[4725]: I0227 07:52:09.906487 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f478fcd58-cfjzp_372d4de4-ea8f-4393-af8b-1139e593ac16/horizon-log/0.log" Feb 27 07:52:09 crc kubenswrapper[4725]: I0227 07:52:09.936423 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_9d1d1822-20db-4d79-9ba2-0746292596c6/kube-state-metrics/0.log" Feb 27 07:52:10 crc kubenswrapper[4725]: I0227 07:52:10.112388 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fgvzr_501e41e3-55eb-4b62-b4b5-67f594761a64/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:10 crc kubenswrapper[4725]: I0227 07:52:10.179636 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6fccbd6487-2trpv_c5d7d934-34b3-46a7-94d2-0803780d5837/keystone-api/0.log" Feb 27 07:52:10 crc kubenswrapper[4725]: I0227 07:52:10.637252 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xwlx5_b5bb9130-cfdc-481b-8e8a-c72f5562b963/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:10 crc kubenswrapper[4725]: I0227 07:52:10.694719 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b86d8c849-9kc54_bdb517d6-290d-43f7-9791-297c8dace84e/neutron-httpd/0.log" Feb 27 07:52:10 crc kubenswrapper[4725]: I0227 07:52:10.820797 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b86d8c849-9kc54_bdb517d6-290d-43f7-9791-297c8dace84e/neutron-api/0.log" Feb 27 07:52:10 crc kubenswrapper[4725]: I0227 07:52:10.857420 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_bc2bb345-ef60-4c05-8461-1821e1db5216/setup-container/0.log" Feb 27 07:52:11 crc kubenswrapper[4725]: I0227 07:52:11.089901 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_bc2bb345-ef60-4c05-8461-1821e1db5216/setup-container/0.log" Feb 27 07:52:11 crc kubenswrapper[4725]: I0227 07:52:11.104561 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_notifications-rabbitmq-server-0_bc2bb345-ef60-4c05-8461-1821e1db5216/rabbitmq/0.log" Feb 27 07:52:11 crc kubenswrapper[4725]: I0227 07:52:11.831959 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_20427622-030f-4f0a-870e-6119d307befa/nova-cell0-conductor-conductor/0.log" Feb 27 07:52:12 crc kubenswrapper[4725]: I0227 07:52:12.087267 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_25ef4822-4a88-4b23-8c61-03d89105d848/nova-cell1-conductor-conductor/0.log" Feb 27 07:52:12 crc kubenswrapper[4725]: I0227 07:52:12.495266 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_48e1fa8a-566b-49a3-b4ca-1e3d2d39a2b1/nova-cell1-novncproxy-novncproxy/0.log" Feb 27 07:52:12 crc kubenswrapper[4725]: I0227 07:52:12.633702 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a58f39-222d-495a-9cde-272e31f1efae/nova-api-log/0.log" Feb 27 07:52:12 crc kubenswrapper[4725]: I0227 07:52:12.635116 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wjx2k_8ac7b33c-a85a-436b-b4c1-560c074fab9b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:12 crc kubenswrapper[4725]: I0227 07:52:12.953246 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7a660f84-32ef-4def-90b6-fd4a39e117dc/nova-metadata-log/0.log" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 07:52:13.159135 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a9a58f39-222d-495a-9cde-272e31f1efae/nova-api-api/0.log" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 07:52:13.393797 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76702ae7-c9e6-485b-abc9-b54e4c073ee1/mysql-bootstrap/0.log" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 
07:52:13.512704 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d39eae0d-a597-445f-9134-7e2d9f5e82ff/nova-scheduler-scheduler/0.log" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 07:52:13.565458 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76702ae7-c9e6-485b-abc9-b54e4c073ee1/galera/0.log" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 07:52:13.584509 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76702ae7-c9e6-485b-abc9-b54e4c073ee1/mysql-bootstrap/0.log" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 07:52:13.699508 4725 scope.go:117] "RemoveContainer" containerID="7813768d367494d682a7a4eaf844444787970e23a9fd964949555cccdf1a3e78" Feb 27 07:52:13 crc kubenswrapper[4725]: I0227 07:52:13.810713 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7f24b5c8-baad-48b6-9242-2ad6bb6c471f/mysql-bootstrap/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.011360 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7f24b5c8-baad-48b6-9242-2ad6bb6c471f/galera/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.043941 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7f24b5c8-baad-48b6-9242-2ad6bb6c471f/mysql-bootstrap/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.213658 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6c9af008-ad8e-4eaa-b631-543a0ef1bb00/openstackclient/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.380955 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6kvbc_03406108-89c6-4681-aeba-c6874d465b62/ovn-controller/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.560611 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-s2ht5_474555a6-7d91-4881-a4c7-785ccf8185cc/openstack-network-exporter/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.687713 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovsdb-server-init/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.889615 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovsdb-server-init/0.log" Feb 27 07:52:14 crc kubenswrapper[4725]: I0227 07:52:14.918903 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovsdb-server/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.145076 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5c4sz_741a3436-861d-4cb0-925e-597423d841a9/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.248983 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zvrdk_d05458c2-f003-46ea-a38c-eda2c69b4635/ovs-vswitchd/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.305276 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7a660f84-32ef-4def-90b6-fd4a39e117dc/nova-metadata-metadata/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.327433 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_037dd431-5912-4101-9895-0a6d11e627a6/openstack-network-exporter/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.494174 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_037dd431-5912-4101-9895-0a6d11e627a6/ovn-northd/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.572862 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_67b7afed-e3d9-42c8-9604-9d9e56f1bc1d/openstack-network-exporter/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.619387 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_67b7afed-e3d9-42c8-9604-9d9e56f1bc1d/ovsdbserver-nb/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.755818 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8a3fa421-de83-44cb-8857-ef6f679f37dc/openstack-network-exporter/0.log" Feb 27 07:52:15 crc kubenswrapper[4725]: I0227 07:52:15.768162 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8a3fa421-de83-44cb-8857-ef6f679f37dc/ovsdbserver-sb/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.145895 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/init-config-reloader/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.205267 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-745fdc9fb8-jhz6h_3f553c85-a79e-4317-9140-708bda9525e2/placement-api/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.224312 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-745fdc9fb8-jhz6h_3f553c85-a79e-4317-9140-708bda9525e2/placement-log/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.344533 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/init-config-reloader/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.384430 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/config-reloader/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.423842 
4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/prometheus/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.478355 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_06478af8-d30f-4c96-9dbc-360abe61500b/thanos-sidecar/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.618710 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89e15a0f-61a2-4114-b1cc-385f54f886d3/setup-container/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.795218 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89e15a0f-61a2-4114-b1cc-385f54f886d3/setup-container/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.856984 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e4112b6c-11e8-4244-9a39-c7474ffd192b/setup-container/0.log" Feb 27 07:52:16 crc kubenswrapper[4725]: I0227 07:52:16.880191 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89e15a0f-61a2-4114-b1cc-385f54f886d3/rabbitmq/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.045252 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e4112b6c-11e8-4244-9a39-c7474ffd192b/setup-container/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.093128 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e4112b6c-11e8-4244-9a39-c7474ffd192b/rabbitmq/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.172425 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2kl77_9ac46847-17bf-49e5-ae76-1ea3af18c9f5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: 
I0227 07:52:17.328376 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7qtxw_3567a664-44a4-4138-82ec-f35dbffffb40/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.500727 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wld4g_a5ce3d2f-4b00-4971-a37f-3217fd19665a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.550719 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-55xwm_8c8e8aea-4c46-4fe2-844f-2c51d7662fa6/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.762251 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ctvww_22798438-191d-4ecf-ab5d-23af37e208b3/ssh-known-hosts-edpm-deployment/0.log" Feb 27 07:52:17 crc kubenswrapper[4725]: I0227 07:52:17.986032 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-559f68776c-7cj2d_39aed367-30f0-4ebd-a057-e33e50a6f748/proxy-server/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.012136 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-559f68776c-7cj2d_39aed367-30f0-4ebd-a057-e33e50a6f748/proxy-httpd/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.032448 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bq24l_40a2ae59-8725-42be-984a-739a82d476c5/swift-ring-rebalance/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.198173 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-auditor/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.297444 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-reaper/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.308999 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-replicator/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.422840 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/account-server/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.462901 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-auditor/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.528166 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-server/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.601938 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-replicator/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.672653 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/container-updater/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.706746 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-auditor/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.743378 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-expirer/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.871874 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-replicator/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.878355 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-server/0.log" Feb 27 07:52:18 crc kubenswrapper[4725]: I0227 07:52:18.920729 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/object-updater/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.026017 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/rsync/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.077340 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_872eba69-b1d2-4028-b65f-b70fa14daeb0/swift-recon-cron/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.283445 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4q7ph_9898977d-f2bf-4be4-9b90-82fbcc11ba8b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.328104 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a4ededce-4af9-418c-af09-c79e79cb044f/tempest-tests-tempest-tests-runner/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.545020 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-85q2p_308aa3d5-1a73-49da-98ae-a723be6a9c31/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.549541 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1ef86f5d-c8f3-4077-8184-4aecfa313695/test-operator-logs-container/0.log" Feb 27 07:52:19 crc kubenswrapper[4725]: I0227 07:52:19.642992 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_163ce132-3935-4648-b50f-fab5db3c17ca/memcached/0.log" Feb 27 07:52:20 crc kubenswrapper[4725]: I0227 07:52:20.361383 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_b2f1b2e7-bd25-401a-ae31-c49984f2c438/watcher-applier/0.log" Feb 27 07:52:21 crc kubenswrapper[4725]: I0227 07:52:21.020725 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8ca0e165-57b9-4dbd-a8a8-e036ba316122/watcher-api-log/0.log" Feb 27 07:52:23 crc kubenswrapper[4725]: I0227 07:52:23.066814 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_a075032f-0182-44f6-8dd4-b190bf27ed02/watcher-decision-engine/0.log" Feb 27 07:52:24 crc kubenswrapper[4725]: I0227 07:52:24.196560 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_8ca0e165-57b9-4dbd-a8a8-e036ba316122/watcher-api/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.138219 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/util/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.327148 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/pull/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.384071 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/util/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.385420 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/pull/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.503786 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/util/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.553349 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/extract/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.575447 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_56f71b24661a57c5d1903dca1e355bd5b30f833ecdfd555d690e2dcdfft45dr_9c9b70fa-9547-4fa3-b567-68ee52aa3b21/pull/0.log" Feb 27 07:52:45 crc kubenswrapper[4725]: I0227 07:52:45.982953 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-xkffm_f1fefb43-64d1-496a-be4b-042d68027526/manager/0.log" Feb 27 07:52:46 crc kubenswrapper[4725]: I0227 07:52:46.599560 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-48l7p_55b7330d-fa67-491c-9354-3ae2f377b245/manager/0.log" Feb 27 07:52:46 crc kubenswrapper[4725]: I0227 07:52:46.703456 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-pklxn_246fa0fd-dd91-4c17-9754-8ed71768660a/manager/0.log" Feb 27 07:52:46 crc kubenswrapper[4725]: I0227 
07:52:46.939710 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-jjgd6_672a2ef1-a6d0-41f6-9bbf-5d157863ee48/manager/0.log" Feb 27 07:52:47 crc kubenswrapper[4725]: I0227 07:52:47.631779 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-wc9zj_9eeeac0e-6f80-4882-8d61-effa2342d69b/manager/0.log" Feb 27 07:52:47 crc kubenswrapper[4725]: I0227 07:52:47.799685 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-llwd8_119c1266-bd43-49d6-a39f-93abbf47c2be/manager/0.log" Feb 27 07:52:48 crc kubenswrapper[4725]: I0227 07:52:48.161566 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-4wg4t_1e6b09aa-e1b0-41c7-8aa0-e560de6310d5/manager/0.log" Feb 27 07:52:48 crc kubenswrapper[4725]: I0227 07:52:48.376172 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-zfhwz_76de952b-76db-47de-8891-40006493cf30/manager/0.log" Feb 27 07:52:48 crc kubenswrapper[4725]: I0227 07:52:48.581789 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-pl57b_a90d813f-86f2-49c9-b7d2-66d44db8236c/manager/0.log" Feb 27 07:52:48 crc kubenswrapper[4725]: I0227 07:52:48.685281 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-x8wnp_28697286-96cb-46ad-a4a5-acc3716aba31/manager/0.log" Feb 27 07:52:48 crc kubenswrapper[4725]: I0227 07:52:48.915797 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-4g4xh_1ec01345-1480-48b1-9d36-9dd8a9fc2ef8/manager/0.log" Feb 27 07:52:49 crc 
kubenswrapper[4725]: I0227 07:52:49.045532 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-pfq48_c32453ad-27be-4f95-bfc1-67878c36f13a/manager/0.log" Feb 27 07:52:49 crc kubenswrapper[4725]: I0227 07:52:49.236340 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-6d6fj_c6de99a3-3c54-4192-8cf6-fab2c5c9750b/manager/0.log" Feb 27 07:52:49 crc kubenswrapper[4725]: I0227 07:52:49.446890 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cvl9h4_5fccc629-9a1d-4920-b3e7-817e49953fc1/manager/0.log" Feb 27 07:52:49 crc kubenswrapper[4725]: I0227 07:52:49.731426 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7544f859d8-744ft_cc86b762-a7df-42aa-970c-76ebac88b004/operator/0.log" Feb 27 07:52:49 crc kubenswrapper[4725]: I0227 07:52:49.960922 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-r75bv_d6b63be0-6e6a-4e30-8648-28a0174338a4/registry-server/0.log" Feb 27 07:52:50 crc kubenswrapper[4725]: I0227 07:52:50.282703 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-vx95x_9972ea1a-4a28-4b7f-b511-9dd8dd3e0599/manager/0.log" Feb 27 07:52:50 crc kubenswrapper[4725]: I0227 07:52:50.338986 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-fgn4m_2e53de05-a35e-4ca4-9776-1492c5030554/manager/0.log" Feb 27 07:52:50 crc kubenswrapper[4725]: I0227 07:52:50.498158 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-zb72l_96664d14-2465-472a-b6c6-5589153d5ee3/operator/0.log" Feb 27 
07:52:50 crc kubenswrapper[4725]: I0227 07:52:50.666411 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-m65cb_4f2efcf0-a55a-49e9-b815-5eb6d7f9b24d/manager/0.log" Feb 27 07:52:50 crc kubenswrapper[4725]: I0227 07:52:50.997837 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-jp6zc_6f5e713b-cd6d-482f-8603-4dd47d2297d8/manager/0.log" Feb 27 07:52:51 crc kubenswrapper[4725]: I0227 07:52:51.204472 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-2sqrk_a9acda6b-5c71-406c-985e-c5e026b064c8/manager/0.log" Feb 27 07:52:51 crc kubenswrapper[4725]: I0227 07:52:51.367854 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c68576fd-g8db5_59d481fc-2689-420f-b779-c7d840fac75d/manager/0.log" Feb 27 07:52:51 crc kubenswrapper[4725]: I0227 07:52:51.872180 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5cb5b7b9c5-kwj9k_31b25662-0274-4176-b3fd-4edd98517298/manager/0.log" Feb 27 07:52:57 crc kubenswrapper[4725]: I0227 07:52:57.415490 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-6sd5f_6e80b5f0-45bb-4081-808e-800527949f7e/manager/0.log" Feb 27 07:53:12 crc kubenswrapper[4725]: I0227 07:53:12.712906 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x796f_2aec175b-6e2b-4eac-a94f-771881386ffc/control-plane-machine-set-operator/0.log" Feb 27 07:53:12 crc kubenswrapper[4725]: I0227 07:53:12.908757 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-89pl9_682c856f-0661-4039-b071-e5c75267f3f1/kube-rbac-proxy/0.log" Feb 27 07:53:12 crc kubenswrapper[4725]: I0227 07:53:12.969177 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-89pl9_682c856f-0661-4039-b071-e5c75267f3f1/machine-api-operator/0.log" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.380734 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmmjr"] Feb 27 07:53:27 crc kubenswrapper[4725]: E0227 07:53:27.381649 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="extract-content" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.381663 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="extract-content" Feb 27 07:53:27 crc kubenswrapper[4725]: E0227 07:53:27.381693 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="registry-server" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.381699 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="registry-server" Feb 27 07:53:27 crc kubenswrapper[4725]: E0227 07:53:27.381720 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472f98a7-158b-4367-a5ab-0a5c7362482c" containerName="oc" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.381727 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="472f98a7-158b-4367-a5ab-0a5c7362482c" containerName="oc" Feb 27 07:53:27 crc kubenswrapper[4725]: E0227 07:53:27.381740 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="extract-utilities" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.381753 4725 
state_mem.go:107] "Deleted CPUSet assignment" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="extract-utilities" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.381940 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="472f98a7-158b-4367-a5ab-0a5c7362482c" containerName="oc" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.381952 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="488e19fe-5ef0-4faf-ab88-87eddee8d26b" containerName="registry-server" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.383310 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.394046 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmmjr"] Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.496954 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-utilities\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.497066 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j999\" (UniqueName: \"kubernetes.io/projected/fb1604e3-546a-46dd-bcda-0840fdc50793-kube-api-access-6j999\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.497091 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-catalog-content\") pod 
\"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.599325 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j999\" (UniqueName: \"kubernetes.io/projected/fb1604e3-546a-46dd-bcda-0840fdc50793-kube-api-access-6j999\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.599613 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-catalog-content\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.600031 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-utilities\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.600043 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-catalog-content\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.600241 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-utilities\") pod \"redhat-operators-lmmjr\" (UID: 
\"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.632125 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j999\" (UniqueName: \"kubernetes.io/projected/fb1604e3-546a-46dd-bcda-0840fdc50793-kube-api-access-6j999\") pod \"redhat-operators-lmmjr\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:27 crc kubenswrapper[4725]: I0227 07:53:27.706746 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:28 crc kubenswrapper[4725]: I0227 07:53:28.146696 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4dbtd_1c9eb526-ea0d-4c3b-a6e8-309bee7c42f9/cert-manager-controller/0.log" Feb 27 07:53:28 crc kubenswrapper[4725]: I0227 07:53:28.353701 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmmjr"] Feb 27 07:53:28 crc kubenswrapper[4725]: I0227 07:53:28.423872 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerStarted","Data":"6f1dcd157760ea0ebef52e1a8f24e742904294a0f57d47cfee195880e7145ff6"} Feb 27 07:53:28 crc kubenswrapper[4725]: I0227 07:53:28.506561 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ggzrl_17417e2f-0dcc-4720-8766-65a0d193ae26/cert-manager-cainjector/0.log" Feb 27 07:53:28 crc kubenswrapper[4725]: I0227 07:53:28.667646 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-frkvb_4c233d31-e0c7-4e39-9092-7df4e4b23c96/cert-manager-webhook/0.log" Feb 27 07:53:29 crc kubenswrapper[4725]: I0227 07:53:29.456225 4725 generic.go:334] "Generic 
(PLEG): container finished" podID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerID="afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a" exitCode=0 Feb 27 07:53:29 crc kubenswrapper[4725]: I0227 07:53:29.457325 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerDied","Data":"afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a"} Feb 27 07:53:29 crc kubenswrapper[4725]: I0227 07:53:29.464116 4725 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 07:53:30 crc kubenswrapper[4725]: I0227 07:53:30.471107 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerStarted","Data":"de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a"} Feb 27 07:53:32 crc kubenswrapper[4725]: I0227 07:53:32.554416 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:53:32 crc kubenswrapper[4725]: I0227 07:53:32.554793 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:53:36 crc kubenswrapper[4725]: I0227 07:53:36.631568 4725 generic.go:334] "Generic (PLEG): container finished" podID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerID="de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a" exitCode=0 Feb 27 07:53:36 crc 
kubenswrapper[4725]: I0227 07:53:36.631824 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerDied","Data":"de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a"} Feb 27 07:53:37 crc kubenswrapper[4725]: I0227 07:53:37.642754 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerStarted","Data":"52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2"} Feb 27 07:53:37 crc kubenswrapper[4725]: I0227 07:53:37.667603 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmmjr" podStartSLOduration=3.102679448 podStartE2EDuration="10.667582179s" podCreationTimestamp="2026-02-27 07:53:27 +0000 UTC" firstStartedPulling="2026-02-27 07:53:29.462837198 +0000 UTC m=+6187.925457767" lastFinishedPulling="2026-02-27 07:53:37.027739929 +0000 UTC m=+6195.490360498" observedRunningTime="2026-02-27 07:53:37.656823014 +0000 UTC m=+6196.119443593" watchObservedRunningTime="2026-02-27 07:53:37.667582179 +0000 UTC m=+6196.130202758" Feb 27 07:53:37 crc kubenswrapper[4725]: I0227 07:53:37.707407 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:37 crc kubenswrapper[4725]: I0227 07:53:37.707532 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:53:38 crc kubenswrapper[4725]: I0227 07:53:38.765471 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmmjr" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" probeResult="failure" output=< Feb 27 07:53:38 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s 
Feb 27 07:53:38 crc kubenswrapper[4725]: > Feb 27 07:53:42 crc kubenswrapper[4725]: I0227 07:53:42.868341 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-46jf6_450f9ae0-dd5c-4f1a-ad98-651d6cfe09ea/nmstate-console-plugin/0.log" Feb 27 07:53:43 crc kubenswrapper[4725]: I0227 07:53:43.041795 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z2kll_3f5fe9d7-0289-42a1-a991-3d0285038f72/nmstate-handler/0.log" Feb 27 07:53:43 crc kubenswrapper[4725]: I0227 07:53:43.111158 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bdf7n_73608f57-a852-439f-82b8-364a37b0e88c/kube-rbac-proxy/0.log" Feb 27 07:53:43 crc kubenswrapper[4725]: I0227 07:53:43.166117 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bdf7n_73608f57-a852-439f-82b8-364a37b0e88c/nmstate-metrics/0.log" Feb 27 07:53:43 crc kubenswrapper[4725]: I0227 07:53:43.294183 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-hgq5q_c0b93a17-8e40-4f49-94c7-cf241342c7be/nmstate-operator/0.log" Feb 27 07:53:43 crc kubenswrapper[4725]: I0227 07:53:43.406337 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-6sx27_7b2eb7dd-736f-4c16-8630-ed2a8607e094/nmstate-webhook/0.log" Feb 27 07:53:48 crc kubenswrapper[4725]: I0227 07:53:48.761920 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmmjr" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" probeResult="failure" output=< Feb 27 07:53:48 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:53:48 crc kubenswrapper[4725]: > Feb 27 07:53:58 crc kubenswrapper[4725]: I0227 07:53:58.656188 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vpgt6_9fefe362-2058-4721-930e-9651059cfcc8/prometheus-operator/0.log" Feb 27 07:53:58 crc kubenswrapper[4725]: I0227 07:53:58.767212 4725 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmmjr" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" probeResult="failure" output=< Feb 27 07:53:58 crc kubenswrapper[4725]: timeout: failed to connect service ":50051" within 1s Feb 27 07:53:58 crc kubenswrapper[4725]: > Feb 27 07:53:58 crc kubenswrapper[4725]: I0227 07:53:58.873003 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj_0f50c85e-bec7-4a58-9317-b86b3ba5e02c/prometheus-operator-admission-webhook/0.log" Feb 27 07:53:58 crc kubenswrapper[4725]: I0227 07:53:58.917136 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74_7e243b80-5980-459f-ba42-90ebdd42e05b/prometheus-operator-admission-webhook/0.log" Feb 27 07:53:59 crc kubenswrapper[4725]: I0227 07:53:59.122886 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-x6b52_5810e280-be69-4236-9014-d459c65bd287/operator/0.log" Feb 27 07:53:59 crc kubenswrapper[4725]: I0227 07:53:59.249359 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9h7hk_0c2b0104-f94a-4e8a-bcd0-464ac8942f54/perses-operator/0.log" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.168132 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536314-wsdjz"] Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.171162 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.175896 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.176153 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.176267 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.182508 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmk8h\" (UniqueName: \"kubernetes.io/projected/6d90dce9-f9eb-4200-99c0-d9f523d57587-kube-api-access-cmk8h\") pod \"auto-csr-approver-29536314-wsdjz\" (UID: \"6d90dce9-f9eb-4200-99c0-d9f523d57587\") " pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.183437 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536314-wsdjz"] Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.285184 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmk8h\" (UniqueName: \"kubernetes.io/projected/6d90dce9-f9eb-4200-99c0-d9f523d57587-kube-api-access-cmk8h\") pod \"auto-csr-approver-29536314-wsdjz\" (UID: \"6d90dce9-f9eb-4200-99c0-d9f523d57587\") " pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.302914 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmk8h\" (UniqueName: \"kubernetes.io/projected/6d90dce9-f9eb-4200-99c0-d9f523d57587-kube-api-access-cmk8h\") pod \"auto-csr-approver-29536314-wsdjz\" (UID: \"6d90dce9-f9eb-4200-99c0-d9f523d57587\") " 
pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.487617 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:00 crc kubenswrapper[4725]: I0227 07:54:00.946586 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536314-wsdjz"] Feb 27 07:54:00 crc kubenswrapper[4725]: W0227 07:54:00.948511 4725 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d90dce9_f9eb_4200_99c0_d9f523d57587.slice/crio-0fec287a37aef932801eb31d1ddad18852c869f5f0ac64abb1b4199bc5ea4ec1 WatchSource:0}: Error finding container 0fec287a37aef932801eb31d1ddad18852c869f5f0ac64abb1b4199bc5ea4ec1: Status 404 returned error can't find the container with id 0fec287a37aef932801eb31d1ddad18852c869f5f0ac64abb1b4199bc5ea4ec1 Feb 27 07:54:01 crc kubenswrapper[4725]: I0227 07:54:01.881472 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" event={"ID":"6d90dce9-f9eb-4200-99c0-d9f523d57587","Type":"ContainerStarted","Data":"0fec287a37aef932801eb31d1ddad18852c869f5f0ac64abb1b4199bc5ea4ec1"} Feb 27 07:54:02 crc kubenswrapper[4725]: I0227 07:54:02.554760 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:54:02 crc kubenswrapper[4725]: I0227 07:54:02.555547 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 27 07:54:02 crc kubenswrapper[4725]: I0227 07:54:02.890251 4725 generic.go:334] "Generic (PLEG): container finished" podID="6d90dce9-f9eb-4200-99c0-d9f523d57587" containerID="58291aa0682e0ae52da72d5740d8e9d00b64cb47de91294af3b610a861bc5af1" exitCode=0 Feb 27 07:54:02 crc kubenswrapper[4725]: I0227 07:54:02.890310 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" event={"ID":"6d90dce9-f9eb-4200-99c0-d9f523d57587","Type":"ContainerDied","Data":"58291aa0682e0ae52da72d5740d8e9d00b64cb47de91294af3b610a861bc5af1"} Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.294706 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.472766 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmk8h\" (UniqueName: \"kubernetes.io/projected/6d90dce9-f9eb-4200-99c0-d9f523d57587-kube-api-access-cmk8h\") pod \"6d90dce9-f9eb-4200-99c0-d9f523d57587\" (UID: \"6d90dce9-f9eb-4200-99c0-d9f523d57587\") " Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.479237 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d90dce9-f9eb-4200-99c0-d9f523d57587-kube-api-access-cmk8h" (OuterVolumeSpecName: "kube-api-access-cmk8h") pod "6d90dce9-f9eb-4200-99c0-d9f523d57587" (UID: "6d90dce9-f9eb-4200-99c0-d9f523d57587"). InnerVolumeSpecName "kube-api-access-cmk8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.575104 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmk8h\" (UniqueName: \"kubernetes.io/projected/6d90dce9-f9eb-4200-99c0-d9f523d57587-kube-api-access-cmk8h\") on node \"crc\" DevicePath \"\"" Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.914598 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" event={"ID":"6d90dce9-f9eb-4200-99c0-d9f523d57587","Type":"ContainerDied","Data":"0fec287a37aef932801eb31d1ddad18852c869f5f0ac64abb1b4199bc5ea4ec1"} Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.914635 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fec287a37aef932801eb31d1ddad18852c869f5f0ac64abb1b4199bc5ea4ec1" Feb 27 07:54:04 crc kubenswrapper[4725]: I0227 07:54:04.914721 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536314-wsdjz" Feb 27 07:54:05 crc kubenswrapper[4725]: I0227 07:54:05.367009 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536308-h578z"] Feb 27 07:54:05 crc kubenswrapper[4725]: I0227 07:54:05.378046 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536308-h578z"] Feb 27 07:54:06 crc kubenswrapper[4725]: I0227 07:54:06.262859 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4477496-7340-4a34-b228-fd6cbc1609de" path="/var/lib/kubelet/pods/a4477496-7340-4a34-b228-fd6cbc1609de/volumes" Feb 27 07:54:07 crc kubenswrapper[4725]: I0227 07:54:07.772720 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:54:07 crc kubenswrapper[4725]: I0227 07:54:07.832925 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:54:08 crc kubenswrapper[4725]: I0227 07:54:08.006088 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmmjr"] Feb 27 07:54:08 crc kubenswrapper[4725]: I0227 07:54:08.959396 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lmmjr" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" containerID="cri-o://52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2" gracePeriod=2 Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.471724 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.570951 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j999\" (UniqueName: \"kubernetes.io/projected/fb1604e3-546a-46dd-bcda-0840fdc50793-kube-api-access-6j999\") pod \"fb1604e3-546a-46dd-bcda-0840fdc50793\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.571153 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-utilities\") pod \"fb1604e3-546a-46dd-bcda-0840fdc50793\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.571218 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-catalog-content\") pod \"fb1604e3-546a-46dd-bcda-0840fdc50793\" (UID: \"fb1604e3-546a-46dd-bcda-0840fdc50793\") " Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.572862 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-utilities" (OuterVolumeSpecName: "utilities") pod "fb1604e3-546a-46dd-bcda-0840fdc50793" (UID: "fb1604e3-546a-46dd-bcda-0840fdc50793"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.579318 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1604e3-546a-46dd-bcda-0840fdc50793-kube-api-access-6j999" (OuterVolumeSpecName: "kube-api-access-6j999") pod "fb1604e3-546a-46dd-bcda-0840fdc50793" (UID: "fb1604e3-546a-46dd-bcda-0840fdc50793"). InnerVolumeSpecName "kube-api-access-6j999". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.674224 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j999\" (UniqueName: \"kubernetes.io/projected/fb1604e3-546a-46dd-bcda-0840fdc50793-kube-api-access-6j999\") on node \"crc\" DevicePath \"\"" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.674799 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.731038 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb1604e3-546a-46dd-bcda-0840fdc50793" (UID: "fb1604e3-546a-46dd-bcda-0840fdc50793"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.777393 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb1604e3-546a-46dd-bcda-0840fdc50793-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.969459 4725 generic.go:334] "Generic (PLEG): container finished" podID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerID="52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2" exitCode=0 Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.969508 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerDied","Data":"52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2"} Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.969525 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lmmjr" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.969545 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmmjr" event={"ID":"fb1604e3-546a-46dd-bcda-0840fdc50793","Type":"ContainerDied","Data":"6f1dcd157760ea0ebef52e1a8f24e742904294a0f57d47cfee195880e7145ff6"} Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.969563 4725 scope.go:117] "RemoveContainer" containerID="52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2" Feb 27 07:54:09 crc kubenswrapper[4725]: I0227 07:54:09.991589 4725 scope.go:117] "RemoveContainer" containerID="de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.017459 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmmjr"] Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.028434 4725 scope.go:117] "RemoveContainer" containerID="afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.034373 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lmmjr"] Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.060114 4725 scope.go:117] "RemoveContainer" containerID="52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2" Feb 27 07:54:10 crc kubenswrapper[4725]: E0227 07:54:10.060581 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2\": container with ID starting with 52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2 not found: ID does not exist" containerID="52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.060617 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2"} err="failed to get container status \"52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2\": rpc error: code = NotFound desc = could not find container \"52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2\": container with ID starting with 52b79e0d15ebfd59d9dc2e3a93ce8b31a6bbada1178b6582842a69711e129cc2 not found: ID does not exist" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.060641 4725 scope.go:117] "RemoveContainer" containerID="de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a" Feb 27 07:54:10 crc kubenswrapper[4725]: E0227 07:54:10.061152 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a\": container with ID starting with de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a not found: ID does not exist" containerID="de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.061307 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a"} err="failed to get container status \"de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a\": rpc error: code = NotFound desc = could not find container \"de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a\": container with ID starting with de06a47ac3e8c42936f082c93ea4ce062bd0e7851a20559418812d3fb99e782a not found: ID does not exist" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.061419 4725 scope.go:117] "RemoveContainer" containerID="afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a" Feb 27 07:54:10 crc kubenswrapper[4725]: E0227 
07:54:10.061867 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a\": container with ID starting with afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a not found: ID does not exist" containerID="afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.061982 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a"} err="failed to get container status \"afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a\": rpc error: code = NotFound desc = could not find container \"afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a\": container with ID starting with afd96d9a677fc6427f8d18bfbbe847f0911156634a3677ed70d1afd42f7dd82a not found: ID does not exist" Feb 27 07:54:10 crc kubenswrapper[4725]: I0227 07:54:10.283147 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" path="/var/lib/kubelet/pods/fb1604e3-546a-46dd-bcda-0840fdc50793/volumes" Feb 27 07:54:13 crc kubenswrapper[4725]: I0227 07:54:13.911809 4725 scope.go:117] "RemoveContainer" containerID="60ece3ad52b4f8469c5776a116b6037721c7514351f568eb508b8e3bc3e624b9" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.216658 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-smlh9_ba636a80-6000-456f-a447-d754b6d0acd2/kube-rbac-proxy/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.286387 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-smlh9_ba636a80-6000-456f-a447-d754b6d0acd2/controller/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.382063 4725 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.564266 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.575194 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.587299 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.605366 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.792049 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.853057 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.868448 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:54:15 crc kubenswrapper[4725]: I0227 07:54:15.905052 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.094524 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-frr-files/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.101020 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/controller/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.103543 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-reloader/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.103592 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/cp-metrics/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.316587 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/kube-rbac-proxy-frr/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.323806 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/frr-metrics/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.340249 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/kube-rbac-proxy/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.574697 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/reloader/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.637138 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-vsfpn_21527ec2-4ffa-49d2-9866-89690a83fa42/frr-k8s-webhook-server/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.829901 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-679f87455-8frdc_675e5722-7295-4f2f-acaa-7ad289facd96/manager/0.log" Feb 27 07:54:16 crc kubenswrapper[4725]: I0227 07:54:16.996227 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59c54c548d-8fzq9_78958b18-878f-4fce-b4b0-d799ed1225ce/webhook-server/0.log" Feb 27 07:54:17 crc kubenswrapper[4725]: I0227 07:54:17.040939 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r767c_0b817ed1-7c17-4e44-a421-c43b2c06ec64/kube-rbac-proxy/0.log" Feb 27 07:54:17 crc kubenswrapper[4725]: I0227 07:54:17.748985 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r767c_0b817ed1-7c17-4e44-a421-c43b2c06ec64/speaker/0.log" Feb 27 07:54:18 crc kubenswrapper[4725]: I0227 07:54:18.438138 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lkpgf_8b46659e-d3d2-46a7-a93b-1209af0baea4/frr/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.201170 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/util/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.404368 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/pull/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.404526 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/pull/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.457325 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/util/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.601736 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/pull/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.647336 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/util/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.651236 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dnbrd_b14dab93-a732-4bc5-8ecc-cc49b374669f/extract/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.768801 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/util/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.968957 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/util/0.log" Feb 27 07:54:31 crc kubenswrapper[4725]: I0227 07:54:31.986011 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/pull/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.011812 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/pull/0.log" Feb 27 
07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.163208 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/extract/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.178521 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/util/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.178931 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08vv6tj_935bf92a-4c1e-47c1-a2e2-cd49a6db1b93/pull/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.351759 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-utilities/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.490426 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-utilities/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.511028 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-content/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.530930 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-content/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.554701 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.554759 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.554797 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.555296 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b6a77d2e5351870ca8784a3bea682f8887057ff4350604fd14f640d1e0ac2f3"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.555352 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://0b6a77d2e5351870ca8784a3bea682f8887057ff4350604fd14f640d1e0ac2f3" gracePeriod=600 Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.710993 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-content/0.log" Feb 27 07:54:32 crc kubenswrapper[4725]: I0227 07:54:32.741504 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/extract-utilities/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.005719 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/extract-utilities/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.146968 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sl74x_88d9ca16-8811-47ec-9f57-f3e2e41620be/registry-server/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.160238 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/extract-content/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.189211 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/extract-utilities/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.200748 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/extract-content/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.203911 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="0b6a77d2e5351870ca8784a3bea682f8887057ff4350604fd14f640d1e0ac2f3" exitCode=0 Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.203948 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"0b6a77d2e5351870ca8784a3bea682f8887057ff4350604fd14f640d1e0ac2f3"} Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.203973 4725 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerStarted","Data":"7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547"} Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.203990 4725 scope.go:117] "RemoveContainer" containerID="bdd9c129d0aa80394fda32322cb8b19e688ec12c98c0367cabc03fb9f3c27aee" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.351889 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/extract-utilities/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.380838 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/extract-content/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.558950 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qm87b_5144bc55-8158-469c-b43b-92a491875b63/registry-server/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.574016 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/util/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.729688 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/pull/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.751743 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/util/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.845666 4725 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/pull/0.log" Feb 27 07:54:33 crc kubenswrapper[4725]: I0227 07:54:33.999769 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/util/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.007476 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/pull/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.011620 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4cmzmm_7f14764c-87e1-4b38-9343-86b368d36b24/extract/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.196672 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-utilities/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.200818 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-b4kzj_edee26dc-dc59-4500-8fe6-0f9f7e9c4546/marketplace-operator/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.334868 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-utilities/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.356956 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-content/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.422070 4725 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-content/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.591767 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-utilities/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.679020 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/extract-content/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.808744 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vq6n_a5e5d8d3-00b5-4799-afd9-d360d58aee21/registry-server/0.log" Feb 27 07:54:34 crc kubenswrapper[4725]: I0227 07:54:34.844352 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-utilities/0.log" Feb 27 07:54:35 crc kubenswrapper[4725]: I0227 07:54:35.040251 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-content/0.log" Feb 27 07:54:35 crc kubenswrapper[4725]: I0227 07:54:35.043401 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-content/0.log" Feb 27 07:54:35 crc kubenswrapper[4725]: I0227 07:54:35.054102 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-utilities/0.log" Feb 27 07:54:35 crc kubenswrapper[4725]: I0227 07:54:35.243053 4725 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-utilities/0.log" Feb 27 07:54:35 crc kubenswrapper[4725]: I0227 07:54:35.256343 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/extract-content/0.log" Feb 27 07:54:35 crc kubenswrapper[4725]: I0227 07:54:35.993032 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dw9mm_8dc5591a-ae2b-4664-8c09-216b72be4a2e/registry-server/0.log" Feb 27 07:54:47 crc kubenswrapper[4725]: I0227 07:54:47.467105 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-vpgt6_9fefe362-2058-4721-930e-9651059cfcc8/prometheus-operator/0.log" Feb 27 07:54:47 crc kubenswrapper[4725]: I0227 07:54:47.518646 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-zxswj_0f50c85e-bec7-4a58-9317-b86b3ba5e02c/prometheus-operator-admission-webhook/0.log" Feb 27 07:54:47 crc kubenswrapper[4725]: I0227 07:54:47.534084 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-584c9c7c98-cjr74_7e243b80-5980-459f-ba42-90ebdd42e05b/prometheus-operator-admission-webhook/0.log" Feb 27 07:54:47 crc kubenswrapper[4725]: I0227 07:54:47.686081 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-x6b52_5810e280-be69-4236-9014-d459c65bd287/operator/0.log" Feb 27 07:54:47 crc kubenswrapper[4725]: I0227 07:54:47.710301 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9h7hk_0c2b0104-f94a-4e8a-bcd0-464ac8942f54/perses-operator/0.log" Feb 27 07:55:03 crc kubenswrapper[4725]: I0227 07:55:03.904241 4725 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/swift-proxy-559f68776c-7cj2d" podUID="39aed367-30f0-4ebd-a057-e33e50a6f748" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.153798 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536316-tt8jn"] Feb 27 07:56:00 crc kubenswrapper[4725]: E0227 07:56:00.157249 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.157519 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" Feb 27 07:56:00 crc kubenswrapper[4725]: E0227 07:56:00.157730 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="extract-utilities" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.157904 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="extract-utilities" Feb 27 07:56:00 crc kubenswrapper[4725]: E0227 07:56:00.158085 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d90dce9-f9eb-4200-99c0-d9f523d57587" containerName="oc" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.158242 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d90dce9-f9eb-4200-99c0-d9f523d57587" containerName="oc" Feb 27 07:56:00 crc kubenswrapper[4725]: E0227 07:56:00.158488 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="extract-content" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.158634 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="extract-content" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 
07:56:00.159211 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d90dce9-f9eb-4200-99c0-d9f523d57587" containerName="oc" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.159501 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1604e3-546a-46dd-bcda-0840fdc50793" containerName="registry-server" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.160828 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.163527 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.164273 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.164905 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.170903 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536316-tt8jn"] Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.350344 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvwx\" (UniqueName: \"kubernetes.io/projected/6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a-kube-api-access-8zvwx\") pod \"auto-csr-approver-29536316-tt8jn\" (UID: \"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a\") " pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.453228 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvwx\" (UniqueName: \"kubernetes.io/projected/6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a-kube-api-access-8zvwx\") pod \"auto-csr-approver-29536316-tt8jn\" (UID: 
\"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a\") " pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.484754 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvwx\" (UniqueName: \"kubernetes.io/projected/6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a-kube-api-access-8zvwx\") pod \"auto-csr-approver-29536316-tt8jn\" (UID: \"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a\") " pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:00 crc kubenswrapper[4725]: I0227 07:56:00.524922 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:01 crc kubenswrapper[4725]: I0227 07:56:01.018695 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536316-tt8jn"] Feb 27 07:56:01 crc kubenswrapper[4725]: I0227 07:56:01.236609 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" event={"ID":"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a","Type":"ContainerStarted","Data":"50fc4dc037abd9ed7dc33db06387894ebcd315100d4acf1d41a365712458045a"} Feb 27 07:56:03 crc kubenswrapper[4725]: I0227 07:56:03.263387 4725 generic.go:334] "Generic (PLEG): container finished" podID="6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a" containerID="ad3a16937dab0764b71a56b9c4c24d985618a8cb94e84e94d29d5e102abec2b8" exitCode=0 Feb 27 07:56:03 crc kubenswrapper[4725]: I0227 07:56:03.263920 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" event={"ID":"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a","Type":"ContainerDied","Data":"ad3a16937dab0764b71a56b9c4c24d985618a8cb94e84e94d29d5e102abec2b8"} Feb 27 07:56:04 crc kubenswrapper[4725]: I0227 07:56:04.673224 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:04 crc kubenswrapper[4725]: I0227 07:56:04.850990 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zvwx\" (UniqueName: \"kubernetes.io/projected/6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a-kube-api-access-8zvwx\") pod \"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a\" (UID: \"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a\") " Feb 27 07:56:04 crc kubenswrapper[4725]: I0227 07:56:04.861593 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a-kube-api-access-8zvwx" (OuterVolumeSpecName: "kube-api-access-8zvwx") pod "6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a" (UID: "6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a"). InnerVolumeSpecName "kube-api-access-8zvwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:56:04 crc kubenswrapper[4725]: I0227 07:56:04.953993 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zvwx\" (UniqueName: \"kubernetes.io/projected/6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a-kube-api-access-8zvwx\") on node \"crc\" DevicePath \"\"" Feb 27 07:56:05 crc kubenswrapper[4725]: I0227 07:56:05.287611 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" event={"ID":"6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a","Type":"ContainerDied","Data":"50fc4dc037abd9ed7dc33db06387894ebcd315100d4acf1d41a365712458045a"} Feb 27 07:56:05 crc kubenswrapper[4725]: I0227 07:56:05.287715 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50fc4dc037abd9ed7dc33db06387894ebcd315100d4acf1d41a365712458045a" Feb 27 07:56:05 crc kubenswrapper[4725]: I0227 07:56:05.287747 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536316-tt8jn" Feb 27 07:56:05 crc kubenswrapper[4725]: I0227 07:56:05.759620 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536310-wlqrr"] Feb 27 07:56:05 crc kubenswrapper[4725]: I0227 07:56:05.772340 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536310-wlqrr"] Feb 27 07:56:06 crc kubenswrapper[4725]: I0227 07:56:06.292824 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30bbbaae-b9a2-4d51-9645-ea46e88e15a5" path="/var/lib/kubelet/pods/30bbbaae-b9a2-4d51-9645-ea46e88e15a5/volumes" Feb 27 07:56:14 crc kubenswrapper[4725]: I0227 07:56:14.014161 4725 scope.go:117] "RemoveContainer" containerID="1c7a34575176eb2886cde7c02c97d2f1eebd2cdcc03a34aab1baf7456de28d18" Feb 27 07:56:32 crc kubenswrapper[4725]: I0227 07:56:32.554328 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:56:32 crc kubenswrapper[4725]: I0227 07:56:32.554866 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:56:52 crc kubenswrapper[4725]: I0227 07:56:52.868748 4725 generic.go:334] "Generic (PLEG): container finished" podID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerID="3d888a7718927cae308366e52c898ed32a8418c1619c427b93f78b858a902779" exitCode=0 Feb 27 07:56:52 crc kubenswrapper[4725]: I0227 07:56:52.868841 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-bgvhz/must-gather-xtbcg" event={"ID":"4ae06e21-fee2-4230-82be-fbe8eb29deeb","Type":"ContainerDied","Data":"3d888a7718927cae308366e52c898ed32a8418c1619c427b93f78b858a902779"} Feb 27 07:56:52 crc kubenswrapper[4725]: I0227 07:56:52.870106 4725 scope.go:117] "RemoveContainer" containerID="3d888a7718927cae308366e52c898ed32a8418c1619c427b93f78b858a902779" Feb 27 07:56:53 crc kubenswrapper[4725]: I0227 07:56:53.767064 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgvhz_must-gather-xtbcg_4ae06e21-fee2-4230-82be-fbe8eb29deeb/gather/0.log" Feb 27 07:57:02 crc kubenswrapper[4725]: I0227 07:57:02.554808 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:57:02 crc kubenswrapper[4725]: I0227 07:57:02.555391 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:57:06 crc kubenswrapper[4725]: I0227 07:57:06.618984 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgvhz/must-gather-xtbcg"] Feb 27 07:57:06 crc kubenswrapper[4725]: I0227 07:57:06.619708 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerName="copy" containerID="cri-o://0b56db439e8201055683121b7621a7c00faf43c98f969968fbae062406851d5a" gracePeriod=2 Feb 27 07:57:06 crc kubenswrapper[4725]: I0227 07:57:06.636223 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-bgvhz/must-gather-xtbcg"] Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.012632 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgvhz_must-gather-xtbcg_4ae06e21-fee2-4230-82be-fbe8eb29deeb/copy/0.log" Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.013224 4725 generic.go:334] "Generic (PLEG): container finished" podID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerID="0b56db439e8201055683121b7621a7c00faf43c98f969968fbae062406851d5a" exitCode=143 Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.113351 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgvhz_must-gather-xtbcg_4ae06e21-fee2-4230-82be-fbe8eb29deeb/copy/0.log" Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.113765 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.226502 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbp68\" (UniqueName: \"kubernetes.io/projected/4ae06e21-fee2-4230-82be-fbe8eb29deeb-kube-api-access-xbp68\") pod \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.226568 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ae06e21-fee2-4230-82be-fbe8eb29deeb-must-gather-output\") pod \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\" (UID: \"4ae06e21-fee2-4230-82be-fbe8eb29deeb\") " Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.231886 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae06e21-fee2-4230-82be-fbe8eb29deeb-kube-api-access-xbp68" (OuterVolumeSpecName: "kube-api-access-xbp68") pod "4ae06e21-fee2-4230-82be-fbe8eb29deeb" 
(UID: "4ae06e21-fee2-4230-82be-fbe8eb29deeb"). InnerVolumeSpecName "kube-api-access-xbp68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.329040 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbp68\" (UniqueName: \"kubernetes.io/projected/4ae06e21-fee2-4230-82be-fbe8eb29deeb-kube-api-access-xbp68\") on node \"crc\" DevicePath \"\"" Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.445933 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae06e21-fee2-4230-82be-fbe8eb29deeb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4ae06e21-fee2-4230-82be-fbe8eb29deeb" (UID: "4ae06e21-fee2-4230-82be-fbe8eb29deeb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:57:07 crc kubenswrapper[4725]: I0227 07:57:07.533487 4725 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ae06e21-fee2-4230-82be-fbe8eb29deeb-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 07:57:08 crc kubenswrapper[4725]: I0227 07:57:08.025604 4725 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgvhz_must-gather-xtbcg_4ae06e21-fee2-4230-82be-fbe8eb29deeb/copy/0.log" Feb 27 07:57:08 crc kubenswrapper[4725]: I0227 07:57:08.026071 4725 scope.go:117] "RemoveContainer" containerID="0b56db439e8201055683121b7621a7c00faf43c98f969968fbae062406851d5a" Feb 27 07:57:08 crc kubenswrapper[4725]: I0227 07:57:08.026243 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgvhz/must-gather-xtbcg" Feb 27 07:57:08 crc kubenswrapper[4725]: I0227 07:57:08.062595 4725 scope.go:117] "RemoveContainer" containerID="3d888a7718927cae308366e52c898ed32a8418c1619c427b93f78b858a902779" Feb 27 07:57:08 crc kubenswrapper[4725]: I0227 07:57:08.263360 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" path="/var/lib/kubelet/pods/4ae06e21-fee2-4230-82be-fbe8eb29deeb/volumes" Feb 27 07:57:14 crc kubenswrapper[4725]: I0227 07:57:14.097454 4725 scope.go:117] "RemoveContainer" containerID="523efdee1336e5addcc274781d550f560c22d21890669c1401c2c103a9269497" Feb 27 07:57:14 crc kubenswrapper[4725]: I0227 07:57:14.160066 4725 scope.go:117] "RemoveContainer" containerID="c427277ee229ee8bf4b08ed4fc1fee3b3314b1536c3138922730181bfc302618" Feb 27 07:57:32 crc kubenswrapper[4725]: I0227 07:57:32.554944 4725 patch_prober.go:28] interesting pod/machine-config-daemon-mg969 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 07:57:32 crc kubenswrapper[4725]: I0227 07:57:32.555654 4725 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 07:57:32 crc kubenswrapper[4725]: I0227 07:57:32.555716 4725 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mg969" Feb 27 07:57:32 crc kubenswrapper[4725]: I0227 07:57:32.556676 4725 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547"} pod="openshift-machine-config-operator/machine-config-daemon-mg969" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 07:57:32 crc kubenswrapper[4725]: I0227 07:57:32.556760 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerName="machine-config-daemon" containerID="cri-o://7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" gracePeriod=600 Feb 27 07:57:32 crc kubenswrapper[4725]: E0227 07:57:32.686659 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:57:33 crc kubenswrapper[4725]: I0227 07:57:33.257386 4725 generic.go:334] "Generic (PLEG): container finished" podID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" exitCode=0 Feb 27 07:57:33 crc kubenswrapper[4725]: I0227 07:57:33.257454 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mg969" event={"ID":"6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198","Type":"ContainerDied","Data":"7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547"} Feb 27 07:57:33 crc kubenswrapper[4725]: I0227 07:57:33.257497 4725 scope.go:117] "RemoveContainer" containerID="0b6a77d2e5351870ca8784a3bea682f8887057ff4350604fd14f640d1e0ac2f3" Feb 27 07:57:33 crc kubenswrapper[4725]: I0227 07:57:33.258203 4725 
scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:57:33 crc kubenswrapper[4725]: E0227 07:57:33.258626 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:57:46 crc kubenswrapper[4725]: I0227 07:57:46.252236 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:57:46 crc kubenswrapper[4725]: E0227 07:57:46.252993 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.176688 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536318-fv6t2"] Feb 27 07:58:00 crc kubenswrapper[4725]: E0227 07:58:00.178112 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a" containerName="oc" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.178143 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a" containerName="oc" Feb 27 07:58:00 crc kubenswrapper[4725]: E0227 07:58:00.178186 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" 
containerName="copy" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.178203 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerName="copy" Feb 27 07:58:00 crc kubenswrapper[4725]: E0227 07:58:00.178260 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerName="gather" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.178278 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerName="gather" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.178817 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerName="gather" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.178856 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae06e21-fee2-4230-82be-fbe8eb29deeb" containerName="copy" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.178902 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3cafeb-6f04-47a6-ac3d-9d0d83b4b72a" containerName="oc" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.180522 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.183314 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.183752 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.184012 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.190769 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536318-fv6t2"] Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.251703 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:58:00 crc kubenswrapper[4725]: E0227 07:58:00.251941 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.293158 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9j2\" (UniqueName: \"kubernetes.io/projected/65c1b6d1-dac5-4f5a-8252-0df826c05feb-kube-api-access-2p9j2\") pod \"auto-csr-approver-29536318-fv6t2\" (UID: \"65c1b6d1-dac5-4f5a-8252-0df826c05feb\") " pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.395369 4725 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2p9j2\" (UniqueName: \"kubernetes.io/projected/65c1b6d1-dac5-4f5a-8252-0df826c05feb-kube-api-access-2p9j2\") pod \"auto-csr-approver-29536318-fv6t2\" (UID: \"65c1b6d1-dac5-4f5a-8252-0df826c05feb\") " pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.420222 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9j2\" (UniqueName: \"kubernetes.io/projected/65c1b6d1-dac5-4f5a-8252-0df826c05feb-kube-api-access-2p9j2\") pod \"auto-csr-approver-29536318-fv6t2\" (UID: \"65c1b6d1-dac5-4f5a-8252-0df826c05feb\") " pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.516323 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:00 crc kubenswrapper[4725]: I0227 07:58:00.991174 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536318-fv6t2"] Feb 27 07:58:01 crc kubenswrapper[4725]: I0227 07:58:01.552795 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" event={"ID":"65c1b6d1-dac5-4f5a-8252-0df826c05feb","Type":"ContainerStarted","Data":"fc708847971fdd34c1310833564ead685ec426ee52123b158fe142ef8fef6f64"} Feb 27 07:58:02 crc kubenswrapper[4725]: I0227 07:58:02.561065 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" event={"ID":"65c1b6d1-dac5-4f5a-8252-0df826c05feb","Type":"ContainerStarted","Data":"0b257781a5075ba63168910864f4b9bb0ddf264503ba056d70cba56a4f7168a4"} Feb 27 07:58:02 crc kubenswrapper[4725]: I0227 07:58:02.575307 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" podStartSLOduration=1.3604221490000001 
podStartE2EDuration="2.575273682s" podCreationTimestamp="2026-02-27 07:58:00 +0000 UTC" firstStartedPulling="2026-02-27 07:58:00.987546054 +0000 UTC m=+6459.450166633" lastFinishedPulling="2026-02-27 07:58:02.202397597 +0000 UTC m=+6460.665018166" observedRunningTime="2026-02-27 07:58:02.574686136 +0000 UTC m=+6461.037306705" watchObservedRunningTime="2026-02-27 07:58:02.575273682 +0000 UTC m=+6461.037894251" Feb 27 07:58:03 crc kubenswrapper[4725]: I0227 07:58:03.572454 4725 generic.go:334] "Generic (PLEG): container finished" podID="65c1b6d1-dac5-4f5a-8252-0df826c05feb" containerID="0b257781a5075ba63168910864f4b9bb0ddf264503ba056d70cba56a4f7168a4" exitCode=0 Feb 27 07:58:03 crc kubenswrapper[4725]: I0227 07:58:03.572538 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" event={"ID":"65c1b6d1-dac5-4f5a-8252-0df826c05feb","Type":"ContainerDied","Data":"0b257781a5075ba63168910864f4b9bb0ddf264503ba056d70cba56a4f7168a4"} Feb 27 07:58:04 crc kubenswrapper[4725]: I0227 07:58:04.949674 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.092729 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p9j2\" (UniqueName: \"kubernetes.io/projected/65c1b6d1-dac5-4f5a-8252-0df826c05feb-kube-api-access-2p9j2\") pod \"65c1b6d1-dac5-4f5a-8252-0df826c05feb\" (UID: \"65c1b6d1-dac5-4f5a-8252-0df826c05feb\") " Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.098539 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c1b6d1-dac5-4f5a-8252-0df826c05feb-kube-api-access-2p9j2" (OuterVolumeSpecName: "kube-api-access-2p9j2") pod "65c1b6d1-dac5-4f5a-8252-0df826c05feb" (UID: "65c1b6d1-dac5-4f5a-8252-0df826c05feb"). InnerVolumeSpecName "kube-api-access-2p9j2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.195746 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p9j2\" (UniqueName: \"kubernetes.io/projected/65c1b6d1-dac5-4f5a-8252-0df826c05feb-kube-api-access-2p9j2\") on node \"crc\" DevicePath \"\"" Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.355747 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536312-nx4j7"] Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.366835 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536312-nx4j7"] Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.590002 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" event={"ID":"65c1b6d1-dac5-4f5a-8252-0df826c05feb","Type":"ContainerDied","Data":"fc708847971fdd34c1310833564ead685ec426ee52123b158fe142ef8fef6f64"} Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.590035 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc708847971fdd34c1310833564ead685ec426ee52123b158fe142ef8fef6f64" Feb 27 07:58:05 crc kubenswrapper[4725]: I0227 07:58:05.590100 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536318-fv6t2" Feb 27 07:58:06 crc kubenswrapper[4725]: I0227 07:58:06.266142 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472f98a7-158b-4367-a5ab-0a5c7362482c" path="/var/lib/kubelet/pods/472f98a7-158b-4367-a5ab-0a5c7362482c/volumes" Feb 27 07:58:14 crc kubenswrapper[4725]: I0227 07:58:14.235569 4725 scope.go:117] "RemoveContainer" containerID="665b45f72b53fbefc52c7f2fb9874bc112f2ac31a05ad0a2edee7500e9332900" Feb 27 07:58:14 crc kubenswrapper[4725]: I0227 07:58:14.259590 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:58:14 crc kubenswrapper[4725]: E0227 07:58:14.260438 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:58:27 crc kubenswrapper[4725]: I0227 07:58:27.251274 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:58:27 crc kubenswrapper[4725]: E0227 07:58:27.252954 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:58:41 crc kubenswrapper[4725]: I0227 07:58:41.251950 4725 scope.go:117] "RemoveContainer" 
containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:58:41 crc kubenswrapper[4725]: E0227 07:58:41.252733 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:58:54 crc kubenswrapper[4725]: I0227 07:58:54.251668 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:58:54 crc kubenswrapper[4725]: E0227 07:58:54.252561 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:59:07 crc kubenswrapper[4725]: I0227 07:59:07.253164 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:59:07 crc kubenswrapper[4725]: E0227 07:59:07.255245 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.701750 4725 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pm2r9"] Feb 27 07:59:17 crc kubenswrapper[4725]: E0227 07:59:17.703283 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c1b6d1-dac5-4f5a-8252-0df826c05feb" containerName="oc" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.703329 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c1b6d1-dac5-4f5a-8252-0df826c05feb" containerName="oc" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.703745 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c1b6d1-dac5-4f5a-8252-0df826c05feb" containerName="oc" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.707379 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.717466 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm2r9"] Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.879703 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-utilities\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.880111 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwn8\" (UniqueName: \"kubernetes.io/projected/9802d88e-ae78-4120-827f-7df419da4183-kube-api-access-9vwn8\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.880238 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-catalog-content\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.981815 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-utilities\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.981888 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwn8\" (UniqueName: \"kubernetes.io/projected/9802d88e-ae78-4120-827f-7df419da4183-kube-api-access-9vwn8\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.981933 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-catalog-content\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.982257 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-utilities\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:17 crc kubenswrapper[4725]: I0227 07:59:17.982317 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-catalog-content\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:18 crc kubenswrapper[4725]: I0227 07:59:18.015542 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwn8\" (UniqueName: \"kubernetes.io/projected/9802d88e-ae78-4120-827f-7df419da4183-kube-api-access-9vwn8\") pod \"redhat-marketplace-pm2r9\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:18 crc kubenswrapper[4725]: I0227 07:59:18.050465 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:18 crc kubenswrapper[4725]: I0227 07:59:18.542719 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm2r9"] Feb 27 07:59:19 crc kubenswrapper[4725]: I0227 07:59:19.455768 4725 generic.go:334] "Generic (PLEG): container finished" podID="9802d88e-ae78-4120-827f-7df419da4183" containerID="b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2" exitCode=0 Feb 27 07:59:19 crc kubenswrapper[4725]: I0227 07:59:19.455821 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm2r9" event={"ID":"9802d88e-ae78-4120-827f-7df419da4183","Type":"ContainerDied","Data":"b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2"} Feb 27 07:59:19 crc kubenswrapper[4725]: I0227 07:59:19.455987 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm2r9" event={"ID":"9802d88e-ae78-4120-827f-7df419da4183","Type":"ContainerStarted","Data":"d5b1e4209f071a842a38cdb9fdc118b30ae818d330602b328c34a4e8f03a971a"} Feb 27 07:59:19 crc kubenswrapper[4725]: I0227 07:59:19.458045 4725 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 27 07:59:21 crc kubenswrapper[4725]: I0227 07:59:21.483076 4725 generic.go:334] "Generic (PLEG): container finished" podID="9802d88e-ae78-4120-827f-7df419da4183" containerID="23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314" exitCode=0 Feb 27 07:59:21 crc kubenswrapper[4725]: I0227 07:59:21.483130 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm2r9" event={"ID":"9802d88e-ae78-4120-827f-7df419da4183","Type":"ContainerDied","Data":"23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314"} Feb 27 07:59:22 crc kubenswrapper[4725]: I0227 07:59:22.258716 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:59:22 crc kubenswrapper[4725]: E0227 07:59:22.259562 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:59:22 crc kubenswrapper[4725]: I0227 07:59:22.493858 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm2r9" event={"ID":"9802d88e-ae78-4120-827f-7df419da4183","Type":"ContainerStarted","Data":"571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1"} Feb 27 07:59:22 crc kubenswrapper[4725]: I0227 07:59:22.519717 4725 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pm2r9" podStartSLOduration=3.067483719 podStartE2EDuration="5.519698663s" podCreationTimestamp="2026-02-27 07:59:17 +0000 UTC" firstStartedPulling="2026-02-27 07:59:19.457793343 +0000 UTC 
m=+6537.920413912" lastFinishedPulling="2026-02-27 07:59:21.910008257 +0000 UTC m=+6540.372628856" observedRunningTime="2026-02-27 07:59:22.514754983 +0000 UTC m=+6540.977375562" watchObservedRunningTime="2026-02-27 07:59:22.519698663 +0000 UTC m=+6540.982319232" Feb 27 07:59:28 crc kubenswrapper[4725]: I0227 07:59:28.051693 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:28 crc kubenswrapper[4725]: I0227 07:59:28.052000 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:28 crc kubenswrapper[4725]: I0227 07:59:28.105086 4725 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:28 crc kubenswrapper[4725]: I0227 07:59:28.609388 4725 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:28 crc kubenswrapper[4725]: I0227 07:59:28.659904 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm2r9"] Feb 27 07:59:30 crc kubenswrapper[4725]: I0227 07:59:30.576742 4725 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pm2r9" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="registry-server" containerID="cri-o://571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1" gracePeriod=2 Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.113472 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.161265 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-catalog-content\") pod \"9802d88e-ae78-4120-827f-7df419da4183\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.161592 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-utilities\") pod \"9802d88e-ae78-4120-827f-7df419da4183\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.161747 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vwn8\" (UniqueName: \"kubernetes.io/projected/9802d88e-ae78-4120-827f-7df419da4183-kube-api-access-9vwn8\") pod \"9802d88e-ae78-4120-827f-7df419da4183\" (UID: \"9802d88e-ae78-4120-827f-7df419da4183\") " Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.163500 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-utilities" (OuterVolumeSpecName: "utilities") pod "9802d88e-ae78-4120-827f-7df419da4183" (UID: "9802d88e-ae78-4120-827f-7df419da4183"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.169271 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9802d88e-ae78-4120-827f-7df419da4183-kube-api-access-9vwn8" (OuterVolumeSpecName: "kube-api-access-9vwn8") pod "9802d88e-ae78-4120-827f-7df419da4183" (UID: "9802d88e-ae78-4120-827f-7df419da4183"). InnerVolumeSpecName "kube-api-access-9vwn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.196729 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9802d88e-ae78-4120-827f-7df419da4183" (UID: "9802d88e-ae78-4120-827f-7df419da4183"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.265889 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vwn8\" (UniqueName: \"kubernetes.io/projected/9802d88e-ae78-4120-827f-7df419da4183-kube-api-access-9vwn8\") on node \"crc\" DevicePath \"\"" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.265927 4725 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.265942 4725 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9802d88e-ae78-4120-827f-7df419da4183-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.590931 4725 generic.go:334] "Generic (PLEG): container finished" podID="9802d88e-ae78-4120-827f-7df419da4183" containerID="571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1" exitCode=0 Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.591222 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm2r9" event={"ID":"9802d88e-ae78-4120-827f-7df419da4183","Type":"ContainerDied","Data":"571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1"} Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.591248 4725 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pm2r9" event={"ID":"9802d88e-ae78-4120-827f-7df419da4183","Type":"ContainerDied","Data":"d5b1e4209f071a842a38cdb9fdc118b30ae818d330602b328c34a4e8f03a971a"} Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.591264 4725 scope.go:117] "RemoveContainer" containerID="571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.591401 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm2r9" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.622981 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm2r9"] Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.631003 4725 scope.go:117] "RemoveContainer" containerID="23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.634530 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm2r9"] Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.656070 4725 scope.go:117] "RemoveContainer" containerID="b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.711406 4725 scope.go:117] "RemoveContainer" containerID="571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1" Feb 27 07:59:31 crc kubenswrapper[4725]: E0227 07:59:31.711879 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1\": container with ID starting with 571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1 not found: ID does not exist" containerID="571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.711933 4725 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1"} err="failed to get container status \"571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1\": rpc error: code = NotFound desc = could not find container \"571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1\": container with ID starting with 571b0f15a2dd1c0248f1a8c38f2cd64e84ac463b14855bb34b6d7a3dc8b791b1 not found: ID does not exist" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.711953 4725 scope.go:117] "RemoveContainer" containerID="23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314" Feb 27 07:59:31 crc kubenswrapper[4725]: E0227 07:59:31.712227 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314\": container with ID starting with 23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314 not found: ID does not exist" containerID="23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.712265 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314"} err="failed to get container status \"23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314\": rpc error: code = NotFound desc = could not find container \"23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314\": container with ID starting with 23ff1d172075b1807cd319407b7ca9c91fc27ac3e3f9eec1f22643705a103314 not found: ID does not exist" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.712281 4725 scope.go:117] "RemoveContainer" containerID="b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2" Feb 27 07:59:31 crc kubenswrapper[4725]: E0227 
07:59:31.712557 4725 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2\": container with ID starting with b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2 not found: ID does not exist" containerID="b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2" Feb 27 07:59:31 crc kubenswrapper[4725]: I0227 07:59:31.712584 4725 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2"} err="failed to get container status \"b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2\": rpc error: code = NotFound desc = could not find container \"b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2\": container with ID starting with b00608b7f25bc3b7abc81522ef5ae92add4c11f021199b45f1cae25209d62ba2 not found: ID does not exist" Feb 27 07:59:32 crc kubenswrapper[4725]: I0227 07:59:32.767400 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9802d88e-ae78-4120-827f-7df419da4183" path="/var/lib/kubelet/pods/9802d88e-ae78-4120-827f-7df419da4183/volumes" Feb 27 07:59:34 crc kubenswrapper[4725]: I0227 07:59:34.251832 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:59:34 crc kubenswrapper[4725]: E0227 07:59:34.252332 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 07:59:45 crc kubenswrapper[4725]: I0227 07:59:45.254357 
4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 07:59:45 crc kubenswrapper[4725]: E0227 07:59:45.255214 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.151174 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536320-2d4mf"] Feb 27 08:00:00 crc kubenswrapper[4725]: E0227 08:00:00.152233 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="extract-content" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.152249 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="extract-content" Feb 27 08:00:00 crc kubenswrapper[4725]: E0227 08:00:00.152275 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="extract-utilities" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.152297 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="extract-utilities" Feb 27 08:00:00 crc kubenswrapper[4725]: E0227 08:00:00.152332 4725 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="registry-server" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.152341 4725 state_mem.go:107] "Deleted CPUSet assignment" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="registry-server" Feb 27 08:00:00 crc 
kubenswrapper[4725]: I0227 08:00:00.152629 4725 memory_manager.go:354] "RemoveStaleState removing state" podUID="9802d88e-ae78-4120-827f-7df419da4183" containerName="registry-server" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.153556 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.157514 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.157926 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jm774" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.158199 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.162642 4725 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf"] Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.164073 4725 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.166912 4725 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.167143 4725 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.174772 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536320-2d4mf"] Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.186612 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf"] Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.251590 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 08:00:00 crc kubenswrapper[4725]: E0227 08:00:00.251864 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.326241 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36160f25-3584-4402-8a25-091de4acb560-config-volume\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 
crc kubenswrapper[4725]: I0227 08:00:00.326342 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrkd\" (UniqueName: \"kubernetes.io/projected/36160f25-3584-4402-8a25-091de4acb560-kube-api-access-7qrkd\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.326405 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36160f25-3584-4402-8a25-091de4acb560-secret-volume\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.326542 4725 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6lsm\" (UniqueName: \"kubernetes.io/projected/49fbb7db-0deb-4520-88f6-e51794540d3e-kube-api-access-n6lsm\") pod \"auto-csr-approver-29536320-2d4mf\" (UID: \"49fbb7db-0deb-4520-88f6-e51794540d3e\") " pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.429358 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6lsm\" (UniqueName: \"kubernetes.io/projected/49fbb7db-0deb-4520-88f6-e51794540d3e-kube-api-access-n6lsm\") pod \"auto-csr-approver-29536320-2d4mf\" (UID: \"49fbb7db-0deb-4520-88f6-e51794540d3e\") " pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.430055 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36160f25-3584-4402-8a25-091de4acb560-config-volume\") pod 
\"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.431108 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36160f25-3584-4402-8a25-091de4acb560-config-volume\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.431572 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrkd\" (UniqueName: \"kubernetes.io/projected/36160f25-3584-4402-8a25-091de4acb560-kube-api-access-7qrkd\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.432257 4725 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36160f25-3584-4402-8a25-091de4acb560-secret-volume\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.440797 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36160f25-3584-4402-8a25-091de4acb560-secret-volume\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.450431 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrkd\" 
(UniqueName: \"kubernetes.io/projected/36160f25-3584-4402-8a25-091de4acb560-kube-api-access-7qrkd\") pod \"collect-profiles-29536320-dvcpf\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.454031 4725 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6lsm\" (UniqueName: \"kubernetes.io/projected/49fbb7db-0deb-4520-88f6-e51794540d3e-kube-api-access-n6lsm\") pod \"auto-csr-approver-29536320-2d4mf\" (UID: \"49fbb7db-0deb-4520-88f6-e51794540d3e\") " pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.480069 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:00 crc kubenswrapper[4725]: I0227 08:00:00.493521 4725 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:01 crc kubenswrapper[4725]: I0227 08:00:01.036445 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf"] Feb 27 08:00:01 crc kubenswrapper[4725]: I0227 08:00:01.044999 4725 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536320-2d4mf"] Feb 27 08:00:01 crc kubenswrapper[4725]: I0227 08:00:01.064590 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" event={"ID":"36160f25-3584-4402-8a25-091de4acb560","Type":"ContainerStarted","Data":"3a42f82295899bdd80912745a714aaf2a4b56d202f2d10c290f5165ede034e77"} Feb 27 08:00:01 crc kubenswrapper[4725]: I0227 08:00:01.066396 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" 
event={"ID":"49fbb7db-0deb-4520-88f6-e51794540d3e","Type":"ContainerStarted","Data":"750cb3a46895969e628a87f6fa4fb9c30c63aed0e2e93c70908b4ac06ac65f01"} Feb 27 08:00:02 crc kubenswrapper[4725]: I0227 08:00:02.086676 4725 generic.go:334] "Generic (PLEG): container finished" podID="36160f25-3584-4402-8a25-091de4acb560" containerID="93ca7df82247c24fdf4d4c8ae7faa93f15ab5e9ccd441aa109692c9e6419058a" exitCode=0 Feb 27 08:00:02 crc kubenswrapper[4725]: I0227 08:00:02.086768 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" event={"ID":"36160f25-3584-4402-8a25-091de4acb560","Type":"ContainerDied","Data":"93ca7df82247c24fdf4d4c8ae7faa93f15ab5e9ccd441aa109692c9e6419058a"} Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.495594 4725 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.600762 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36160f25-3584-4402-8a25-091de4acb560-secret-volume\") pod \"36160f25-3584-4402-8a25-091de4acb560\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.600844 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36160f25-3584-4402-8a25-091de4acb560-config-volume\") pod \"36160f25-3584-4402-8a25-091de4acb560\" (UID: \"36160f25-3584-4402-8a25-091de4acb560\") " Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.601025 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qrkd\" (UniqueName: \"kubernetes.io/projected/36160f25-3584-4402-8a25-091de4acb560-kube-api-access-7qrkd\") pod \"36160f25-3584-4402-8a25-091de4acb560\" (UID: 
\"36160f25-3584-4402-8a25-091de4acb560\") " Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.601576 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36160f25-3584-4402-8a25-091de4acb560-config-volume" (OuterVolumeSpecName: "config-volume") pod "36160f25-3584-4402-8a25-091de4acb560" (UID: "36160f25-3584-4402-8a25-091de4acb560"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.606386 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36160f25-3584-4402-8a25-091de4acb560-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36160f25-3584-4402-8a25-091de4acb560" (UID: "36160f25-3584-4402-8a25-091de4acb560"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.606526 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36160f25-3584-4402-8a25-091de4acb560-kube-api-access-7qrkd" (OuterVolumeSpecName: "kube-api-access-7qrkd") pod "36160f25-3584-4402-8a25-091de4acb560" (UID: "36160f25-3584-4402-8a25-091de4acb560"). InnerVolumeSpecName "kube-api-access-7qrkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.703020 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qrkd\" (UniqueName: \"kubernetes.io/projected/36160f25-3584-4402-8a25-091de4acb560-kube-api-access-7qrkd\") on node \"crc\" DevicePath \"\"" Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.703429 4725 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36160f25-3584-4402-8a25-091de4acb560-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 08:00:03 crc kubenswrapper[4725]: I0227 08:00:03.703439 4725 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36160f25-3584-4402-8a25-091de4acb560-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 08:00:04 crc kubenswrapper[4725]: I0227 08:00:04.111750 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" event={"ID":"36160f25-3584-4402-8a25-091de4acb560","Type":"ContainerDied","Data":"3a42f82295899bdd80912745a714aaf2a4b56d202f2d10c290f5165ede034e77"} Feb 27 08:00:04 crc kubenswrapper[4725]: I0227 08:00:04.111789 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a42f82295899bdd80912745a714aaf2a4b56d202f2d10c290f5165ede034e77" Feb 27 08:00:04 crc kubenswrapper[4725]: I0227 08:00:04.111849 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536320-dvcpf" Feb 27 08:00:04 crc kubenswrapper[4725]: I0227 08:00:04.574047 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f"] Feb 27 08:00:04 crc kubenswrapper[4725]: I0227 08:00:04.586391 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536275-7sp8f"] Feb 27 08:00:06 crc kubenswrapper[4725]: I0227 08:00:06.266930 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a50807f4-a5f2-4bfe-ae16-317194fe97da" path="/var/lib/kubelet/pods/a50807f4-a5f2-4bfe-ae16-317194fe97da/volumes" Feb 27 08:00:07 crc kubenswrapper[4725]: I0227 08:00:07.166399 4725 generic.go:334] "Generic (PLEG): container finished" podID="49fbb7db-0deb-4520-88f6-e51794540d3e" containerID="7606456a0cd84f568aabebc4b149a9aad1e30afdd816333ee58fff00e7a7a69c" exitCode=0 Feb 27 08:00:07 crc kubenswrapper[4725]: I0227 08:00:07.166475 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" event={"ID":"49fbb7db-0deb-4520-88f6-e51794540d3e","Type":"ContainerDied","Data":"7606456a0cd84f568aabebc4b149a9aad1e30afdd816333ee58fff00e7a7a69c"} Feb 27 08:00:08 crc kubenswrapper[4725]: I0227 08:00:08.593492 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:08 crc kubenswrapper[4725]: I0227 08:00:08.634053 4725 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6lsm\" (UniqueName: \"kubernetes.io/projected/49fbb7db-0deb-4520-88f6-e51794540d3e-kube-api-access-n6lsm\") pod \"49fbb7db-0deb-4520-88f6-e51794540d3e\" (UID: \"49fbb7db-0deb-4520-88f6-e51794540d3e\") " Feb 27 08:00:08 crc kubenswrapper[4725]: I0227 08:00:08.641581 4725 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fbb7db-0deb-4520-88f6-e51794540d3e-kube-api-access-n6lsm" (OuterVolumeSpecName: "kube-api-access-n6lsm") pod "49fbb7db-0deb-4520-88f6-e51794540d3e" (UID: "49fbb7db-0deb-4520-88f6-e51794540d3e"). InnerVolumeSpecName "kube-api-access-n6lsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 08:00:08 crc kubenswrapper[4725]: I0227 08:00:08.736976 4725 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6lsm\" (UniqueName: \"kubernetes.io/projected/49fbb7db-0deb-4520-88f6-e51794540d3e-kube-api-access-n6lsm\") on node \"crc\" DevicePath \"\"" Feb 27 08:00:09 crc kubenswrapper[4725]: I0227 08:00:09.198404 4725 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" event={"ID":"49fbb7db-0deb-4520-88f6-e51794540d3e","Type":"ContainerDied","Data":"750cb3a46895969e628a87f6fa4fb9c30c63aed0e2e93c70908b4ac06ac65f01"} Feb 27 08:00:09 crc kubenswrapper[4725]: I0227 08:00:09.198463 4725 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750cb3a46895969e628a87f6fa4fb9c30c63aed0e2e93c70908b4ac06ac65f01" Feb 27 08:00:09 crc kubenswrapper[4725]: I0227 08:00:09.198486 4725 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536320-2d4mf" Feb 27 08:00:09 crc kubenswrapper[4725]: I0227 08:00:09.653622 4725 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536314-wsdjz"] Feb 27 08:00:09 crc kubenswrapper[4725]: I0227 08:00:09.662734 4725 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536314-wsdjz"] Feb 27 08:00:10 crc kubenswrapper[4725]: I0227 08:00:10.271848 4725 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d90dce9-f9eb-4200-99c0-d9f523d57587" path="/var/lib/kubelet/pods/6d90dce9-f9eb-4200-99c0-d9f523d57587/volumes" Feb 27 08:00:11 crc kubenswrapper[4725]: I0227 08:00:11.252495 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 08:00:11 crc kubenswrapper[4725]: E0227 08:00:11.252891 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 08:00:14 crc kubenswrapper[4725]: I0227 08:00:14.342001 4725 scope.go:117] "RemoveContainer" containerID="09acf6b11d4bcab27ee07e5cd3f53fcbecc3ca1f0d8958a53ec444716c933e1e" Feb 27 08:00:14 crc kubenswrapper[4725]: I0227 08:00:14.373633 4725 scope.go:117] "RemoveContainer" containerID="58291aa0682e0ae52da72d5740d8e9d00b64cb47de91294af3b610a861bc5af1" Feb 27 08:00:25 crc kubenswrapper[4725]: I0227 08:00:25.252079 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 08:00:25 crc kubenswrapper[4725]: E0227 08:00:25.253004 4725 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198" Feb 27 08:00:39 crc kubenswrapper[4725]: I0227 08:00:39.252418 4725 scope.go:117] "RemoveContainer" containerID="7c30391cb4f7aecad87f18fd059fea8e8c7d70149c7f60ad81d793d6c681c547" Feb 27 08:00:39 crc kubenswrapper[4725]: E0227 08:00:39.253625 4725 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mg969_openshift-machine-config-operator(6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198)\"" pod="openshift-machine-config-operator/machine-config-daemon-mg969" podUID="6c825ee8-1ec6-4b76-9fdc-1f5dab0b3198"